Academic literature on the topic 'Protein language models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Protein language models.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Protein language models"
Tang, Lin. "Protein language models using convolutions." Nature Methods 21, no. 4 (April 2024): 550. http://dx.doi.org/10.1038/s41592-024-02252-3.
Ali, Sarwan, Prakash Chourasia, and Murray Patterson. "When Protein Structure Embedding Meets Large Language Models." Genes 15, no. 1 (December 23, 2023): 25. http://dx.doi.org/10.3390/genes15010025.
Ferruz, Noelia, and Birte Höcker. "Controllable protein design with language models." Nature Machine Intelligence 4, no. 6 (June 2022): 521–32. http://dx.doi.org/10.1038/s42256-022-00499-z.
Li, Xiang, Zhuoyu Wei, Yueran Hu, and Xiaolei Zhu. "GraphNABP: Identifying nucleic acid-binding proteins with protein graphs and protein language models." International Journal of Biological Macromolecules 280 (November 2024): 135599. http://dx.doi.org/10.1016/j.ijbiomac.2024.135599.
Singh, Arunima. "Protein language models guide directed antibody evolution." Nature Methods 20, no. 6 (June 2023): 785. http://dx.doi.org/10.1038/s41592-023-01924-w.
Tran, Chau, Siddharth Khadkikar, and Aleksey Porollo. "Survey of Protein Sequence Embedding Models." International Journal of Molecular Sciences 24, no. 4 (February 14, 2023): 3775. http://dx.doi.org/10.3390/ijms24043775.
Pokharel, Suresh, Pawel Pratyush, Hamid D. Ismail, Junfeng Ma, and Dukka B. KC. "Integrating Embeddings from Multiple Protein Language Models to Improve Protein O-GlcNAc Site Prediction." International Journal of Molecular Sciences 24, no. 21 (November 6, 2023): 16000. http://dx.doi.org/10.3390/ijms242116000.
Wang, Wenkai, Zhenling Peng, and Jianyi Yang. "Single-sequence protein structure prediction using supervised transformer protein language models." Nature Computational Science 2, no. 12 (December 19, 2022): 804–14. http://dx.doi.org/10.1038/s43588-022-00373-3.
Pang, Yihe, and Bin Liu. "IDP-LM: Prediction of protein intrinsic disorder and disorder functions based on language models." PLOS Computational Biology 19, no. 11 (November 22, 2023): e1011657. http://dx.doi.org/10.1371/journal.pcbi.1011657.
Weber, Leon, Kirsten Thobe, Oscar Arturo Migueles Lozano, Jana Wolf, and Ulf Leser. "PEDL: extracting protein–protein associations using deep language models and distant supervision." Bioinformatics 36, Supplement_1 (July 1, 2020): i490–i498. http://dx.doi.org/10.1093/bioinformatics/btaa430.
Dissertations / Theses on the topic "Protein language models"
Meynard, Barthélémy. "Language Models towards Conditional Generative Models of Protein Sequences." Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS195.
This thesis explores the intersection of artificial intelligence (AI) and biology, focusing on how generative models can innovate in protein sequence design. The research unfolds in three distinct yet interconnected stages, each building on the insights of the previous one to improve the models' applicability and performance in protein engineering. We begin by examining what makes a generative model effective for protein sequences. In the first study, "Interpretable Pairwise Distillations for Generative Protein Sequence Models," we compare complex neural-network models to simpler pairwise distribution models. The comparison shows that deep learning strategies mainly capture second-order interactions, highlighting the fundamental role of these interactions in modeling protein families. In the second part, we extend the principle of second-order interactions to inverse folding. In "Uncovering Sequence Diversity from a Known Protein Structure," we present InvMSAFold, a method that produces diverse protein sequences designed to fold into a specific structure. This approach combines two traditions of protein modeling: MSA-based models, which aim to capture the entire fitness landscape, and inverse-folding models, which focus on recovering one specific sequence. It is a first step towards conditioning the fitness landscape on the protein's final structure during design, enabling the generation of sequences that are not only diverse but also retain their intended structural integrity. Finally, we turn to sequence conditioning with "Generating Interacting Protein Sequences using Domain-to-Domain Translation," which introduces a novel approach for generating protein sequences that interact with specific partner proteins. By treating the task as a translation problem, similar to methods used in language processing, we create sequences with intended functionalities. We also address the critical challenge of T-cell receptor (TCR) and epitope interaction prediction in "TULIP—a Transformer based Unsupervised Language model for Interacting Peptides and T-cell receptors," introducing an unsupervised learning approach that accurately predicts TCR-epitope binding and overcomes the limitations in data quality and training bias inherent in previous models. These advances underline the potential of sequence conditioning for creating functionally specific, interaction-aware protein designs.
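For context on the pairwise models the first study compares deep networks against, here is a minimal Python sketch of a Potts-style energy function over sequences, where fields capture site preferences and couplings capture the second-order interactions discussed above; the alphabet size, length, and parameter scales are illustrative assumptions, not taken from the thesis:

import numpy as np

A = 21   # amino-acid alphabet (20 residues + gap); an assumed convention
L = 50   # illustrative sequence length

rng = np.random.default_rng(0)
h = 0.01 * rng.normal(size=(L, A))        # site-wise fields (first order)
J = 0.01 * rng.normal(size=(L, L, A, A))  # pairwise couplings (second order)
J = (J + J.transpose(1, 0, 3, 2)) / 2     # enforce symmetry J_ij(a, b) = J_ji(b, a)

def energy(seq):
    """Potts energy of an integer-encoded sequence; lower means more probable."""
    e = -h[np.arange(L), seq].sum()
    for i in range(L):
        for j in range(i + 1, L):
            e -= J[i, j, seq[i], seq[j]]
    return e

seq = rng.integers(0, A, size=L)  # a random toy sequence
print(energy(seq))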
Hladiš, Matej. "Réseaux de neurones en graphes et modèle de langage des protéines pour révéler le code combinatoire de l'olfaction" [Graph neural networks and a protein language model to reveal the combinatorial code of olfaction]. Electronic Thesis or Diss., Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ5024.
Mammals identify and interpret a myriad of olfactory stimuli using a complex coding mechanism involving interactions between odorant molecules and hundreds of olfactory receptors (ORs). These interactions generate unique combinations of activated receptors, called the combinatorial code, which the human brain interprets as the sensation we call smell. Until now, the vast number of possible receptor-molecule combinations has prevented a large-scale experimental study of this code and its link to odor perception; revealing it is therefore crucial to answering the long-standing question of how we perceive our intricate chemical environment. ORs belong to class A of the G protein-coupled receptors (GPCRs) and constitute the largest known multigene family. To systematically study olfactory coding, we develop M2OR, a comprehensive database compiling the last 25 years of OR bioassays. Using this dataset, a tailored deep learning model is designed and trained. It combines the [CLS] token embedding from a protein language model with graph neural networks and multi-head attention. This model predicts the activation of ORs by odorants and reveals the resulting combinatorial code for any odorous molecule. The approach is refined with a novel model capable of predicting the activity of an odorant at a specific concentration, which in turn allows the EC50 value to be estimated for any OR-odorant pair. Finally, the combinatorial codes derived from both models are used to predict the odor perception of molecules. By incorporating inductive biases inspired by olfactory coding theory, a machine learning model based on these codes outperforms the current state of the art in smell prediction. To the best of our knowledge, this is the most comprehensive and successful application of combinatorial coding to odor quality prediction. Overall, this work links complex molecule-receptor interactions to human perception.
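As a rough, non-authoritative sketch of the architecture this abstract outlines (a precomputed protein language model [CLS] embedding for the receptor, a graph neural network over the odorant, and multi-head attention tying them together), the following PyTorch snippet uses assumed dimensions and a simplified dense-adjacency message-passing step; the class and layer names are ours, not the thesis's:

import torch
import torch.nn as nn

class ORActivationModel(nn.Module):
    def __init__(self, lm_dim=1280, atom_dim=32, hidden=128, heads=4):
        super().__init__()
        self.receptor_proj = nn.Linear(lm_dim, hidden)  # project the [CLS] embedding
        self.atom_proj = nn.Linear(atom_dim, hidden)
        self.gnn = nn.Linear(hidden, hidden)            # one message-passing step
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)                # active / inactive logit

    def forward(self, cls_emb, atom_feats, adj):
        # cls_emb: (B, lm_dim); atom_feats: (B, N, atom_dim); adj: (B, N, N)
        x = torch.relu(self.atom_proj(atom_feats))
        x = torch.relu(self.gnn(adj @ x))               # aggregate neighbor features
        q = self.receptor_proj(cls_emb).unsqueeze(1)    # receptor acts as the query
        ctx, _ = self.attn(q, x, x)                     # attend over odorant atoms
        return self.head(ctx.squeeze(1)).squeeze(-1)

model = ORActivationModel()
logit = model(torch.randn(2, 1280), torch.randn(2, 12, 32), torch.rand(2, 12, 12))
print(torch.sigmoid(logit))  # predicted activation probabilities

Using the receptor embedding as the attention query over atom embeddings is one straightforward way to fuse the two modalities; the thesis may of course combine them differently.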
Books on the topic "Protein language models"
Bovington, Sue. Tigris/Thames. [United Kingdom]: [Sue Bovington], 2011.
Solís Leree, Beatriz, ed. La Ley televisa y la lucha por el poder en México [The Televisa Law and the struggle for power in Mexico]. México, D.F.: Universidad Autónoma Metropolitana, Unidad Xochimilco, 2009.
Yoshikawa, Saeko. William Wordsworth and Modern Travel. Liverpool University Press, 2020. http://dx.doi.org/10.3828/liverpool/9781789621181.001.0001.
Hardiman, David. The Nonviolent Struggle for Indian Freedom, 1905-19. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190920678.001.0001.
McNally, Michael D. Defend the Sacred. Princeton University Press, 2020. http://dx.doi.org/10.23943/princeton/9780691190907.001.0001.
Meddings, Jennifer, Vineet Chopra, and Sanjay Saint. Preventing Hospital Infections. 2nd ed. Oxford University Press, 2021. http://dx.doi.org/10.1093/med/9780197509159.001.0001.
Halvorsen, Tor, and Peter Vale. One World, Many Knowledges: Regional experiences and cross-regional links in higher education. African Minds, 2016. http://dx.doi.org/10.47622/978-0-620-55789-4.
Full textBook chapters on the topic "Protein language models"
Xu, Yaoyao, Xinjian Zhao, Xiaozhuang Song, Benyou Wang, and Tianshu Yu. "Boosting Protein Language Models with Negative Sample Mining." In Lecture Notes in Computer Science, 199–214. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-70381-2_13.
Zhao, Junming, Chao Zhang, and Yunan Luo. "Contrastive Fitness Learning: Reprogramming Protein Language Models for Low-N Learning of Protein Fitness Landscape." In Lecture Notes in Computer Science, 470–74. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_55.
Ghazikhani, Hamed, and Gregory Butler. "A Study on the Application of Protein Language Models in the Analysis of Membrane Proteins." In Distributed Computing and Artificial Intelligence, Special Sessions, 19th International Conference, 147–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23210-7_14.
Zeng, Shuai, Duolin Wang, Lei Jiang, and Dong Xu. "Prompt-Based Learning on Large Protein Language Models Improves Signal Peptide Prediction." In Lecture Notes in Computer Science, 400–405. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_40.
Fernández, Diego, Álvaro Olivera-Nappa, Roberto Uribe-Paredes, and David Medina-Ortiz. "Exploring Machine Learning Algorithms and Protein Language Models Strategies to Develop Enzyme Classification Systems." In Bioinformatics and Biomedical Engineering, 307–19. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-34953-9_24.
Paaß, Gerhard, and Sven Giesselbach. "Foundation Models for Speech, Images, Videos, and Control." In Artificial Intelligence: Foundations, Theory, and Algorithms, 313–82. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23190-2_7.
Shan, Kaixuan, Xiankun Zhang, and Chen Song. "Prediction of Protein-DNA Binding Sites Based on Protein Language Model and Deep Learning." In Advanced Intelligent Computing in Bioinformatics, 314–25. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5692-6_28.
Matsiunova, Antonina. "Semantic opposition of US versus THEM in late 2020 Russian-language Belarusian discourse." In Protest in Late Modern Societies, 42–55. London: Routledge, 2023. http://dx.doi.org/10.4324/9781003270065-4.
Mullett, Michael. "Language and Action in Peasant Revolts." In Popular Culture and Popular Protest in Late Medieval and Early Modern Europe, 71–109. London: Routledge, 2021. http://dx.doi.org/10.4324/9781003188858-3.
Fan, Jianye, Xiaofeng Liu, Shoubin Dong, and Jinlong Hu. "Enriching Pre-trained Language Model with Dependency Syntactic Information for Chemical-Protein Interaction Extraction." In Lecture Notes in Computer Science, 58–69. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56725-5_5.
Conference papers on the topic "Protein language models"
Jiang, Yanfeng, Ning Sun, Zhengxian Lu, Shuang Peng, Yi Zhang, Fei Yang, and Tao Li. "MEFold: Memory-Efficient Optimization for Protein Language Models via Chunk and Quantization." In 2024 International Joint Conference on Neural Networks (IJCNN), 1–8. IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651470.
Kim, Yunsoo. "Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models." In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), 346–55. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.acl-srw.30.
Engel, Ryan, and Gilchan Park. "Evaluating Large Language Models for Predicting Protein Behavior under Radiation Exposure and Disease Conditions." In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, 427–39. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.bionlp-1.34.
Lin, Ming. "DPPred-indel: A pathogenic inframe indel prediction method based on biological language models and fusion of DNA and protein features." In 2024 Fourth International Conference on Biomedicine and Bioinformatics Engineering (ICBBE 2024), edited by Pier Paolo Piccaluga, Ahmed El-Hashash, and Xiangqian Guo, 67. SPIE, 2024. http://dx.doi.org/10.1117/12.3044406.
Zeinalipour, Kamyar, Neda Jamshidi, Monica Bianchini, Marco Maggini, and Marco Gori. "Design Proteins Using Large Language Models: Enhancements and Comparative Analyses." In Proceedings of the 1st Workshop on Language + Molecules (L+M 2024), 34–47. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.langmol-1.5.
Peng, Shuang, Fei Yang, Ning Sun, Sheng Chen, Yanfeng Jiang, and Aimin Pan. "Exploring Post-Training Quantization of Protein Language Models." In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385775.
Gao, Liyuan, Kyler Shu, Jun Zhang, and Victor S. Sheng. "Explainable Transcription Factor Prediction with Protein Language Models." In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385498.
Škrhák, Vít, Kamila Riedlova, Marian Novotný, and David Hoksza. "Cryptic binding site prediction with protein language models." In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385497.
Wu, Xinbo, Alexandru Hanganu, Ayuko Hoshino, and Lav R. Varshney. "Source Identification for Exosomal Communication via Protein Language Models." In 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2022. http://dx.doi.org/10.1109/mlsp55214.2022.9943418.
Iinuma, Naoki, Makoto Miwa, and Yutaka Sasaki. "Improving Supervised Drug-Protein Relation Extraction with Distantly Supervised Models." In Proceedings of the 21st Workshop on Biomedical Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.bionlp-1.16.
Full textReports on the topic "Protein language models"
Wu, Jyun-Jie. Improving Predictive Efficiency and Literature Quality Assessment for Lung Cancer Complications Post-Proton Therapy Through Large Language Models and Meta-Analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, August 2024. http://dx.doi.org/10.37766/inplasy2024.8.0103.
Shani, Uri, Lynn Dudley, Alon Ben-Gal, Menachem Moshelion, and Yajun Wu. Root Conductance, Root-soil Interface Water Potential, Water and Ion Channel Function, and Tissue Expression Profile as Affected by Environmental Conditions. United States Department of Agriculture, October 2007. http://dx.doi.org/10.32747/2007.7592119.bard.
Melnyk, Iurii. Justification of Occupation in German (1938) and Russian (2014) Media: Substitution of Aggressor and Victim. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11101.
Yatsymirska, Mariya. Key Impressions of 2020 in Journalistic Texts. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.
Or, Etti, David Galbraith, and Anne Fennell. Exploring mechanisms involved in grape bud dormancy: Large-scale analysis of expression reprogramming following controlled dormancy induction and dormancy release. United States Department of Agriculture, December 2002. http://dx.doi.org/10.32747/2002.7587232.bard.