Selected scientific literature on the topic "Protein language models"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Browse the list of current articles, books, theses, conference proceedings, and other scholarly sources on the topic "Protein language models".
Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, if one is available in the work's metadata.
Journal articles on the topic "Protein language models"
Tang, Lin. "Protein language models using convolutions". Nature Methods 21, no. 4 (April 2024): 550. http://dx.doi.org/10.1038/s41592-024-02252-3.
Ali, Sarwan, Prakash Chourasia, and Murray Patterson. "When Protein Structure Embedding Meets Large Language Models". Genes 15, no. 1 (December 23, 2023): 25. http://dx.doi.org/10.3390/genes15010025.
Ferruz, Noelia, and Birte Höcker. "Controllable protein design with language models". Nature Machine Intelligence 4, no. 6 (June 2022): 521–32. http://dx.doi.org/10.1038/s42256-022-00499-z.
Li, Xiang, Zhuoyu Wei, Yueran Hu, and Xiaolei Zhu. "GraphNABP: Identifying nucleic acid-binding proteins with protein graphs and protein language models". International Journal of Biological Macromolecules 280 (November 2024): 135599. http://dx.doi.org/10.1016/j.ijbiomac.2024.135599.
Singh, Arunima. "Protein language models guide directed antibody evolution". Nature Methods 20, no. 6 (June 2023): 785. http://dx.doi.org/10.1038/s41592-023-01924-w.
Tran, Chau, Siddharth Khadkikar, and Aleksey Porollo. "Survey of Protein Sequence Embedding Models". International Journal of Molecular Sciences 24, no. 4 (February 14, 2023): 3775. http://dx.doi.org/10.3390/ijms24043775.
Pokharel, Suresh, Pawel Pratyush, Hamid D. Ismail, Junfeng Ma, and Dukka B. KC. "Integrating Embeddings from Multiple Protein Language Models to Improve Protein O-GlcNAc Site Prediction". International Journal of Molecular Sciences 24, no. 21 (November 6, 2023): 16000. http://dx.doi.org/10.3390/ijms242116000.
Wang, Wenkai, Zhenling Peng, and Jianyi Yang. "Single-sequence protein structure prediction using supervised transformer protein language models". Nature Computational Science 2, no. 12 (December 19, 2022): 804–14. http://dx.doi.org/10.1038/s43588-022-00373-3.
Pang, Yihe, and Bin Liu. "IDP-LM: Prediction of protein intrinsic disorder and disorder functions based on language models". PLOS Computational Biology 19, no. 11 (November 22, 2023): e1011657. http://dx.doi.org/10.1371/journal.pcbi.1011657.
Weber, Leon, Kirsten Thobe, Oscar Arturo Migueles Lozano, Jana Wolf, and Ulf Leser. "PEDL: extracting protein–protein associations using deep language models and distant supervision". Bioinformatics 36, Supplement_1 (July 1, 2020): i490–i498. http://dx.doi.org/10.1093/bioinformatics/btaa430.
Theses on the topic "Protein language models"
Meynard, Barthélémy. "Language Models towards Conditional Generative Models of Proteins Sequences". Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS195.
This thesis explores the intersection of artificial intelligence (AI) and biology, focusing on how generative models can innovate in protein sequence design. Our research unfolds in three distinct yet interconnected stages, each building on the insights of the previous one to enhance the model's applicability and performance in protein engineering. We begin by examining what makes a generative model effective for protein sequences. In our first study, "Interpretable Pairwise Distillations for Generative Protein Sequence Models", we compare complex neural network models to simpler, pairwise distribution models. This comparison shows that deep learning strategies mainly model second-order interactions, highlighting the fundamental role of these interactions in modeling protein families. In the second part, we extend this principle of second-order interactions to inverse folding. We explore structure conditioning in "Uncovering Sequence Diversity from a Known Protein Structure", where we present InvMSAFold, a method that produces diverse protein sequences designed to fold into a specific structure. This approach combines two traditions of protein modeling: MSA-based models, which try to capture the entire fitness landscape, and inverse-folding models, which focus on recovering one specific sequence. It is a first step towards conditioning the fitness landscape on the protein's final structure during design, enabling the generation of sequences that are not only diverse but also maintain their intended structural integrity. Finally, we delve into sequence conditioning with "Generating Interacting Protein Sequences using Domain-to-Domain Translation". This study introduces a novel approach to generating protein sequences that interact with specific other proteins.
By treating this as a translation problem, similar to methods used in language processing, we create sequences with intended functionalities. Furthermore, we address the critical challenge of T-cell receptor (TCR) and epitope interaction prediction in "TULIP—a Transformer based Unsupervised Language model for Interacting Peptides and T-cell receptors". This study introduces an unsupervised learning approach that accurately predicts TCR-epitope binding, overcoming the limitations in data quality and training bias inherent in previous models. These advancements underline the potential of sequence conditioning in creating functionally specific and interaction-aware protein designs.
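The "pairwise distribution models" that the first study compares deep networks against are, in spirit, Potts-style models over aligned sequences: one field term per position and one coupling term per pair of positions. A minimal stdlib-Python sketch of such a second-order energy function, with all parameter values as random stand-ins rather than anything fitted to real data:

```python
import itertools
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def potts_energy(seq, fields, couplings):
    """Energy under a second-order (Potts-style) model:
    E(s) = -sum_i h_i(s_i) - sum_{i<j} J_ij(s_i, s_j)."""
    energy = 0.0
    for i, a in enumerate(seq):
        energy -= fields[i][a]                   # first-order (site) terms
    for (i, a), (j, b) in itertools.combinations(enumerate(seq), 2):
        energy -= couplings[(i, j)][(a, b)]      # second-order (pairwise) terms
    return energy

random.seed(0)
L = 5  # toy alignment length
fields = [{a: random.gauss(0, 1) for a in AMINO_ACIDS} for _ in range(L)]
couplings = {(i, j): {(a, b): random.gauss(0, 0.1)
                      for a in AMINO_ACIDS for b in AMINO_ACIDS}
             for i in range(L) for j in range(i + 1, L)}

print(potts_energy("ACDEF", fields, couplings))
```

In such models, higher probability corresponds to lower energy via P(s) proportional to exp(-E(s)); the study's distillation asks how much of a deep model's behavior is captured by these explicit site and pair terms alone.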
Hladiš, Matej. "Réseaux de neurones en graphes et modèle de langage des protéines pour révéler le code combinatoire de l'olfaction". Electronic Thesis or Diss., Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ5024.
Mammals identify and interpret a myriad of olfactory stimuli using a complex coding mechanism involving interactions between odorant molecules and hundreds of olfactory receptors (ORs). These interactions generate unique combinations of activated receptors, called the combinatorial code, which the human brain interprets as the sensation we call smell. Until now, the vast number of possible receptor-molecule combinations has prevented a large-scale experimental study of this code and its link to odor perception. Revealing this code is therefore crucial to answering the long-standing question of how we perceive our intricate chemical environment. ORs belong to class A of the G protein-coupled receptors (GPCRs) and constitute the largest known multigene family. To systematically study olfactory coding, we develop M2OR, a comprehensive database compiling the last 25 years of OR bioassays. Using this dataset, a tailored deep learning model is designed and trained. It combines the [CLS] token embedding from a protein language model with graph neural networks and multi-head attention. This model predicts the activation of ORs by odorants and reveals the resulting combinatorial code for any odorous molecule. The approach is refined with a novel model capable of predicting the activity of an odorant at a specific concentration, which in turn allows the EC50 value to be estimated for any OR-odorant pair. Finally, the combinatorial codes derived from both models are used to predict the odor perception of molecules. By incorporating inductive biases inspired by olfactory coding theory, a machine learning model based on these codes outperforms the current state of the art in smell prediction. To the best of our knowledge, this is the most comprehensive and successful application of combinatorial coding to odor quality prediction. Overall, this work provides a link between complex molecule-receptor interactions and human perception.
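The fusion step described in this abstract (a receptor-level [CLS] embedding from a protein language model attending over odorant graph features) can be illustrated with a single attention head in plain Python. The vectors, dimensions, and names below are toy stand-ins for illustration, not the thesis's actual architecture:

```python
import math
import random

def attend(query, keys, values):
    """One head of scaled dot-product attention: a single query vector
    attends over a set of key/value vectors and returns the weighted
    mixture of the values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Numerically stable softmax over the scores.
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Convex combination of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

random.seed(1)
dim = 8
cls_embedding = [random.gauss(0, 1) for _ in range(dim)]   # receptor [CLS] vector (toy)
atom_feats = [[random.gauss(0, 1) for _ in range(dim)]     # per-atom odorant features (toy)
              for _ in range(6)]
context = attend(cls_embedding, atom_feats, atom_feats)
print(len(context))
```

In the multi-head version, several such heads run in parallel on learned projections of the same inputs, and their outputs are concatenated before the downstream activation prediction.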
Books on the topic "Protein language models"
Bovington, Sue. Tigris/Thames. [United Kingdom]: [Sue Bovington], 2011.
Beatriz, Solís Leree, ed. La Ley televisa y la lucha por el poder en México. México, D.F: Universidad Autónoma Metropolitana, Unidad Xochimilco, 2009.
Yoshikawa, Saeko. William Wordsworth and Modern Travel. Liverpool University Press, 2020. http://dx.doi.org/10.3828/liverpool/9781789621181.001.0001.
Hardiman, David. The Nonviolent Struggle for Indian Freedom, 1905-19. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190920678.001.0001.
McNally, Michael D. Defend the Sacred. Princeton University Press, 2020. http://dx.doi.org/10.23943/princeton/9780691190907.001.0001.
Meddings, Jennifer, Vineet Chopra, and Sanjay Saint. Preventing Hospital Infections. 2nd ed. Oxford University Press, 2021. http://dx.doi.org/10.1093/med/9780197509159.001.0001.
Halvorsen, Tar, and Peter Vale. One World, Many Knowledges: Regional experiences and cross-regional links in higher education. African Minds, 2016. http://dx.doi.org/10.47622/978-0-620-55789-4.
Book chapters on the topic "Protein language models"
Xu, Yaoyao, Xinjian Zhao, Xiaozhuang Song, Benyou Wang, and Tianshu Yu. "Boosting Protein Language Models with Negative Sample Mining". In Lecture Notes in Computer Science, 199–214. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-70381-2_13.
Zhao, Junming, Chao Zhang, and Yunan Luo. "Contrastive Fitness Learning: Reprogramming Protein Language Models for Low-N Learning of Protein Fitness Landscape". In Lecture Notes in Computer Science, 470–74. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_55.
Ghazikhani, Hamed, and Gregory Butler. "A Study on the Application of Protein Language Models in the Analysis of Membrane Proteins". In Distributed Computing and Artificial Intelligence, Special Sessions, 19th International Conference, 147–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23210-7_14.
Zeng, Shuai, Duolin Wang, Lei Jiang, and Dong Xu. "Prompt-Based Learning on Large Protein Language Models Improves Signal Peptide Prediction". In Lecture Notes in Computer Science, 400–405. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_40.
Fernández, Diego, Álvaro Olivera-Nappa, Roberto Uribe-Paredes, and David Medina-Ortiz. "Exploring Machine Learning Algorithms and Protein Language Models Strategies to Develop Enzyme Classification Systems". In Bioinformatics and Biomedical Engineering, 307–19. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-34953-9_24.
Paaß, Gerhard, and Sven Giesselbach. "Foundation Models for Speech, Images, Videos, and Control". In Artificial Intelligence: Foundations, Theory, and Algorithms, 313–82. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23190-2_7.
Shan, Kaixuan, Xiankun Zhang, and Chen Song. "Prediction of Protein-DNA Binding Sites Based on Protein Language Model and Deep Learning". In Advanced Intelligent Computing in Bioinformatics, 314–25. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5692-6_28.
Matsiunova, Antonina. "Semantic opposition of US versus THEM in late 2020 Russian-language Belarusian discourse". In Protest in Late Modern Societies, 42–55. London: Routledge, 2023. http://dx.doi.org/10.4324/9781003270065-4.
Mullett, Michael. "Language and Action in Peasant Revolts". In Popular Culture and Popular Protest in Late Medieval and Early Modern Europe, 71–109. London: Routledge, 2021. http://dx.doi.org/10.4324/9781003188858-3.
Fan, Jianye, Xiaofeng Liu, Shoubin Dong, and Jinlong Hu. "Enriching Pre-trained Language Model with Dependency Syntactic Information for Chemical-Protein Interaction Extraction". In Lecture Notes in Computer Science, 58–69. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56725-5_5.
Conference papers on the topic "Protein language models"
Jiang, Yanfeng, Ning Sun, Zhengxian Lu, Shuang Peng, Yi Zhang, Fei Yang, and Tao Li. "MEFold: Memory-Efficient Optimization for Protein Language Models via Chunk and Quantization". In 2024 International Joint Conference on Neural Networks (IJCNN), 1–8. IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651470.
Kim, Yunsoo. "Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models". In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), 346–55. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.acl-srw.30.
Engel, Ryan, and Gilchan Park. "Evaluating Large Language Models for Predicting Protein Behavior under Radiation Exposure and Disease Conditions". In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, 427–39. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.bionlp-1.34.
Lin, Ming. "DPPred-indel: a pathogenic inframe indel prediction method based on biological language models and fusion of DNA and protein features". In 2024 Fourth International Conference on Biomedicine and Bioinformatics Engineering (ICBBE 2024), edited by Pier Paolo Piccaluga, Ahmed El-Hashash, and Xiangqian Guo, 67. SPIE, 2024. http://dx.doi.org/10.1117/12.3044406.
Zeinalipour, Kamyar, Neda Jamshidi, Monica Bianchini, Marco Maggini, and Marco Gori. "Design Proteins Using Large Language Models: Enhancements and Comparative Analyses". In Proceedings of the 1st Workshop on Language + Molecules (L+M 2024), 34–47. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.langmol-1.5.
Peng, Shuang, Fei Yang, Ning Sun, Sheng Chen, Yanfeng Jiang, and Aimin Pan. "Exploring Post-Training Quantization of Protein Language Models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385775.
Gao, Liyuan, Kyler Shu, Jun Zhang, and Victor S. Sheng. "Explainable Transcription Factor Prediction with Protein Language Models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385498.
Škrhák, Vít, Kamila Riedlova, Marian Novotný, and David Hoksza. "Cryptic binding site prediction with protein language models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385497.
Wu, Xinbo, Alexandru Hanganu, Ayuko Hoshino, and Lav R. Varshney. "Source Identification for Exosomal Communication via Protein Language Models". In 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2022. http://dx.doi.org/10.1109/mlsp55214.2022.9943418.
Iinuma, Naoki, Makoto Miwa, and Yutaka Sasaki. "Improving Supervised Drug-Protein Relation Extraction with Distantly Supervised Models". In Proceedings of the 21st Workshop on Biomedical Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.bionlp-1.16.
Reports by organizations on the topic "Protein language models"
Wu, Jyun-Jie. Improving Predictive Efficiency and Literature Quality Assessment for Lung Cancer Complications Post-Proton Therapy Through Large Language Models and Meta-Analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, August 2024. http://dx.doi.org/10.37766/inplasy2024.8.0103.
Shani, Uri, Lynn Dudley, Alon Ben-Gal, Menachem Moshelion, and Yajun Wu. Root Conductance, Root-soil Interface Water Potential, Water and Ion Channel Function, and Tissue Expression Profile as Affected by Environmental Conditions. United States Department of Agriculture, October 2007. http://dx.doi.org/10.32747/2007.7592119.bard.
Melnyk, Iurii. JUSTIFICATION OF OCCUPATION IN GERMAN (1938) AND RUSSIAN (2014) MEDIA: SUBSTITUTION OF AGGRESSOR AND VICTIM. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11101.
Yatsymirska, Mariya. KEY IMPRESSIONS OF 2020 IN JOURNALISTIC TEXTS. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.
Or, Etti, David Galbraith, and Anne Fennell. Exploring mechanisms involved in grape bud dormancy: Large-scale analysis of expression reprogramming following controlled dormancy induction and dormancy release. United States Department of Agriculture, December 2002. http://dx.doi.org/10.32747/2002.7587232.bard.