Selected scientific literature on the topic "Protein language models"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Protein language models".
Next to each source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online if it is present in the metadata.
Journal articles on the topic "Protein language models"
Tang, Lin. "Protein language models using convolutions". Nature Methods 21, no. 4 (April 2024): 550. http://dx.doi.org/10.1038/s41592-024-02252-3.
Ali, Sarwan, Prakash Chourasia, and Murray Patterson. "When Protein Structure Embedding Meets Large Language Models". Genes 15, no. 1 (December 23, 2023): 25. http://dx.doi.org/10.3390/genes15010025.
Ferruz, Noelia, and Birte Höcker. "Controllable protein design with language models". Nature Machine Intelligence 4, no. 6 (June 2022): 521–32. http://dx.doi.org/10.1038/s42256-022-00499-z.
Li, Xiang, Zhuoyu Wei, Yueran Hu, and Xiaolei Zhu. "GraphNABP: Identifying nucleic acid-binding proteins with protein graphs and protein language models". International Journal of Biological Macromolecules 280 (November 2024): 135599. http://dx.doi.org/10.1016/j.ijbiomac.2024.135599.
Singh, Arunima. "Protein language models guide directed antibody evolution". Nature Methods 20, no. 6 (June 2023): 785. http://dx.doi.org/10.1038/s41592-023-01924-w.
Tran, Chau, Siddharth Khadkikar, and Aleksey Porollo. "Survey of Protein Sequence Embedding Models". International Journal of Molecular Sciences 24, no. 4 (February 14, 2023): 3775. http://dx.doi.org/10.3390/ijms24043775.
Pokharel, Suresh, Pawel Pratyush, Hamid D. Ismail, Junfeng Ma, and Dukka B. KC. "Integrating Embeddings from Multiple Protein Language Models to Improve Protein O-GlcNAc Site Prediction". International Journal of Molecular Sciences 24, no. 21 (November 6, 2023): 16000. http://dx.doi.org/10.3390/ijms242116000.
Wang, Wenkai, Zhenling Peng, and Jianyi Yang. "Single-sequence protein structure prediction using supervised transformer protein language models". Nature Computational Science 2, no. 12 (December 19, 2022): 804–14. http://dx.doi.org/10.1038/s43588-022-00373-3.
Pang, Yihe, and Bin Liu. "IDP-LM: Prediction of protein intrinsic disorder and disorder functions based on language models". PLOS Computational Biology 19, no. 11 (November 22, 2023): e1011657. http://dx.doi.org/10.1371/journal.pcbi.1011657.
Weber, Leon, Kirsten Thobe, Oscar Arturo Migueles Lozano, Jana Wolf, and Ulf Leser. "PEDL: extracting protein–protein associations using deep language models and distant supervision". Bioinformatics 36, Supplement_1 (July 1, 2020): i490–i498. http://dx.doi.org/10.1093/bioinformatics/btaa430.
Theses / dissertations on the topic "Protein language models"
Meynard, Barthélémy. "Language Models towards Conditional Generative Models of Proteins Sequences". Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS195.
This thesis explores the intersection of artificial intelligence (AI) and biology, focusing on how generative models can innovate in protein sequence design. The research unfolds in three distinct yet interconnected stages, each building on the insights of the previous one to enhance the model's applicability and performance in protein engineering. We begin by examining what makes a generative model effective for protein sequences. In our first study, "Interpretable Pairwise Distillations for Generative Protein Sequence Models," we compare complex neural network models to simpler, pairwise distribution models. This comparison shows that deep learning strategies mainly model second-order interactions, highlighting the fundamental role of such interactions in modeling protein families. In the second part, we extend this principle of second-order interactions to inverse folding. We explore structure conditioning in "Uncovering Sequence Diversity from a Known Protein Structure," where we present InvMSAFold, a method that produces diverse protein sequences designed to fold into a specific structure. This approach combines two different traditions of protein modeling: MSA-based models, which aim to capture the entire fitness landscape, and inverse-folding models, which focus on recovering one specific sequence. It is a first step towards conditioning the fitness landscape on the protein's final structure in the design process, enabling the generation of sequences that are not only diverse but also maintain their intended structural integrity. Finally, we delve into sequence conditioning with "Generating Interacting Protein Sequences using Domain-to-Domain Translation." This study introduces a novel approach to generating protein sequences that can interact with specific other proteins.
By treating this as a translation problem, similar to methods used in language processing, we create sequences with intended functionalities. Furthermore, we address the critical challenge of T-cell receptor (TCR) and epitope interaction prediction in "TULIP—a Transformer based Unsupervised Language model for Interacting Peptides and T-cell receptors." This study introduces an unsupervised learning approach to accurately predict TCR–epitope binding, overcoming limitations in data quality and training bias inherent in previous models. These advances underline the potential of sequence conditioning in creating functionally specific and interaction-aware protein designs.
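The "pairwise distribution models" the abstract contrasts with deep networks can be caricatured as a Potts-style model: each sequence gets an energy from single-site fields plus pairwise couplings, and sequence probability follows a Boltzmann-like distribution. The sketch below is a toy illustration only; the alphabet, length, and randomly drawn parameters are all made up, and the thesis's actual models are learned from protein family data.

```python
import itertools
import math
import random

random.seed(0)

ALPHABET = "ACDE"  # toy 4-letter amino-acid alphabet (hypothetical)
L = 5              # toy sequence length

# Hypothetical parameters: single-site fields h[i, a] and pairwise couplings J[i, j, a, b].
h = {(i, a): random.gauss(0, 0.5) for i in range(L) for a in ALPHABET}
J = {(i, j, a, b): random.gauss(0, 0.2)
     for i in range(L) for j in range(i + 1, L)
     for a in ALPHABET for b in ALPHABET}

def energy(seq):
    """Potts-style energy: sum of site fields plus second-order couplings."""
    e = sum(h[(i, seq[i])] for i in range(L))
    e += sum(J[(i, j, seq[i], seq[j])]
             for i in range(L) for j in range(i + 1, L))
    return e

# Boltzmann-like distribution over all 4^5 = 1024 toy sequences.
seqs = ["".join(s) for s in itertools.product(ALPHABET, repeat=L)]
weights = [math.exp(energy(s)) for s in seqs]
Z = sum(weights)  # partition function
probs = {s: w / Z for s, w in zip(seqs, weights)}

best = max(probs, key=probs.get)
print(best, round(probs[best], 4))
```

Enumerating all sequences is only feasible at toy scale; real pairwise models rely on sampling or approximate inference instead of computing Z exactly.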
Hladiš, Matej. "Réseaux de neurones en graphes et modèle de langage des protéines pour révéler le code combinatoire de l'olfaction". Electronic Thesis or Diss., Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ5024.
Mammals identify and interpret a myriad of olfactory stimuli using a complex coding mechanism involving interactions between odorant molecules and hundreds of olfactory receptors (ORs). These interactions generate unique combinations of activated receptors, called the combinatorial code, which the human brain interprets as the sensation we call smell. Until now, the vast number of possible receptor–molecule combinations has prevented a large-scale experimental study of this code and its link to odor perception. Revealing this code is therefore crucial to answering the long-standing question of how we perceive our intricate chemical environment. ORs belong to class A of the G protein-coupled receptors (GPCRs) and constitute the largest known multigene family. To systematically study olfactory coding, we develop M2OR, a comprehensive database compiling the last 25 years of OR bioassays. Using this dataset, a tailored deep learning model is designed and trained. It combines the [CLS] token embedding from a protein language model with graph neural networks and multi-head attention. This model predicts the activation of ORs by odorants and reveals the resulting combinatorial code for any odorous molecule. The approach is refined by developing a novel model capable of predicting the activity of an odorant at a specific concentration, subsequently allowing the estimation of the EC50 value for any OR–odorant pair. Finally, the combinatorial codes derived from both models are used to predict the odor perception of molecules. By incorporating inductive biases inspired by olfactory coding theory, a machine learning model based on these codes outperforms the current state of the art in smell prediction. To the best of our knowledge, this is the most comprehensive and successful application of combinatorial coding to odor quality prediction. Overall, this work provides a link between complex molecule–receptor interactions and human perception.
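The architecture the abstract describes, a receptor representation (the [CLS] embedding from a protein language model) attending over the atom features of an odorant graph, can be sketched in plain Python. Everything below is a stand-in: the vectors are random rather than outputs of a trained protein language model or GNN, the readout is untrained, and single-head attention replaces the multi-head attention the thesis uses.

```python
import math
import random

random.seed(1)
D = 8  # toy embedding dimension (hypothetical)

def rand_vec(d):
    return [random.gauss(0, 1) for _ in range(d)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Stand-ins: a [CLS]-style receptor embedding (from a protein language model
# in the real system) and per-atom features of an odorant molecule graph
# (from a graph neural network in the real system).
receptor_cls = rand_vec(D)
atom_features = [rand_vec(D) for _ in range(5)]  # 5 atoms

def activation_probability(query, nodes):
    """Single-head scaled dot-product attention: the receptor query attends
    over molecule atoms; the pooled context is mapped to an activation
    probability with a logistic readout."""
    weights = softmax([dot(query, n) / math.sqrt(D) for n in nodes])
    context = [sum(w * n[k] for w, n in zip(weights, nodes)) for k in range(D)]
    logit = dot(query, context)  # untrained stand-in readout
    return 1.0 / (1.0 + math.exp(-logit))

p_activated = activation_probability(receptor_cls, atom_features)
print(round(p_activated, 3))
```

Repeating this for hundreds of ORs against one molecule yields a vector of activation probabilities, i.e. a predicted combinatorial code for that odorant.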
Books on the topic "Protein language models"
Bovington, Sue. Tigris/Thames. [United Kingdom]: [Sue Bovington], 2011.
Beatriz, Solís Leree, ed. La Ley televisa y la lucha por el poder en México. México, D.F.: Universidad Autónoma Metropolitana, Unidad Xochimilco, 2009.
Yoshikawa, Saeko. William Wordsworth and Modern Travel. Liverpool University Press, 2020. http://dx.doi.org/10.3828/liverpool/9781789621181.001.0001.
Hardiman, David. The Nonviolent Struggle for Indian Freedom, 1905-19. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190920678.001.0001.
McNally, Michael D. Defend the Sacred. Princeton University Press, 2020. http://dx.doi.org/10.23943/princeton/9780691190907.001.0001.
Meddings, Jennifer, Vineet Chopra, and Sanjay Saint. Preventing Hospital Infections. 2nd ed. Oxford University Press, 2021. http://dx.doi.org/10.1093/med/9780197509159.001.0001.
Halvorsen, Tor, and Peter Vale. One World, Many Knowledges: Regional experiences and cross-regional links in higher education. African Minds, 2016. http://dx.doi.org/10.47622/978-0-620-55789-4.
Texto completo da fonteCapítulos de livros sobre o assunto "Protein language models"
Xu, Yaoyao, Xinjian Zhao, Xiaozhuang Song, Benyou Wang, and Tianshu Yu. "Boosting Protein Language Models with Negative Sample Mining". In Lecture Notes in Computer Science, 199–214. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-70381-2_13.
Zhao, Junming, Chao Zhang, and Yunan Luo. "Contrastive Fitness Learning: Reprogramming Protein Language Models for Low-N Learning of Protein Fitness Landscape". In Lecture Notes in Computer Science, 470–74. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_55.
Ghazikhani, Hamed, and Gregory Butler. "A Study on the Application of Protein Language Models in the Analysis of Membrane Proteins". In Distributed Computing and Artificial Intelligence, Special Sessions, 19th International Conference, 147–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23210-7_14.
Zeng, Shuai, Duolin Wang, Lei Jiang, and Dong Xu. "Prompt-Based Learning on Large Protein Language Models Improves Signal Peptide Prediction". In Lecture Notes in Computer Science, 400–405. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-1-0716-3989-4_40.
Fernández, Diego, Álvaro Olivera-Nappa, Roberto Uribe-Paredes, and David Medina-Ortiz. "Exploring Machine Learning Algorithms and Protein Language Models Strategies to Develop Enzyme Classification Systems". In Bioinformatics and Biomedical Engineering, 307–19. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-34953-9_24.
Paaß, Gerhard, and Sven Giesselbach. "Foundation Models for Speech, Images, Videos, and Control". In Artificial Intelligence: Foundations, Theory, and Algorithms, 313–82. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23190-2_7.
Shan, Kaixuan, Xiankun Zhang, and Chen Song. "Prediction of Protein-DNA Binding Sites Based on Protein Language Model and Deep Learning". In Advanced Intelligent Computing in Bioinformatics, 314–25. Singapore: Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5692-6_28.
Matsiunova, Antonina. "Semantic opposition of US versus THEM in late 2020 Russian-language Belarusian discourse". In Protest in Late Modern Societies, 42–55. London: Routledge, 2023. http://dx.doi.org/10.4324/9781003270065-4.
Mullett, Michael. "Language and Action in Peasant Revolts". In Popular Culture and Popular Protest in Late Medieval and Early Modern Europe, 71–109. London: Routledge, 2021. http://dx.doi.org/10.4324/9781003188858-3.
Fan, Jianye, Xiaofeng Liu, Shoubin Dong, and Jinlong Hu. "Enriching Pre-trained Language Model with Dependency Syntactic Information for Chemical-Protein Interaction Extraction". In Lecture Notes in Computer Science, 58–69. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56725-5_5.
Texto completo da fonteTrabalhos de conferências sobre o assunto "Protein language models"
Jiang, Yanfeng, Ning Sun, Zhengxian Lu, Shuang Peng, Yi Zhang, Fei Yang, and Tao Li. "MEFold: Memory-Efficient Optimization for Protein Language Models via Chunk and Quantization". In 2024 International Joint Conference on Neural Networks (IJCNN), 1–8. IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651470.
Kim, Yunsoo. "Foundation Model for Biomedical Graphs: Integrating Knowledge Graphs and Protein Structures to Large Language Models". In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), 346–55. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.acl-srw.30.
Engel, Ryan, and Gilchan Park. "Evaluating Large Language Models for Predicting Protein Behavior under Radiation Exposure and Disease Conditions". In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, 427–39. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.bionlp-1.34.
Lin, Ming. "DPPred-indel: a pathogenic inframe indel prediction method based on biological language models and fusion of DNA and protein features". In 2024 Fourth International Conference on Biomedicine and Bioinformatics Engineering (ICBBE 2024), edited by Pier Paolo Piccaluga, Ahmed El-Hashash, and Xiangqian Guo, 67. SPIE, 2024. http://dx.doi.org/10.1117/12.3044406.
Zeinalipour, Kamyar, Neda Jamshidi, Monica Bianchini, Marco Maggini, and Marco Gori. "Design Proteins Using Large Language Models: Enhancements and Comparative Analyses". In Proceedings of the 1st Workshop on Language + Molecules (L+M 2024), 34–47. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.langmol-1.5.
Peng, Shuang, Fei Yang, Ning Sun, Sheng Chen, Yanfeng Jiang, and Aimin Pan. "Exploring Post-Training Quantization of Protein Language Models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385775.
Gao, Liyuan, Kyler Shu, Jun Zhang, and Victor S. Sheng. "Explainable Transcription Factor Prediction with Protein Language Models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385498.
Škrhák, Vít, Kamila Riedlova, Marian Novotný, and David Hoksza. "Cryptic binding site prediction with protein language models". In 2023 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE, 2023. http://dx.doi.org/10.1109/bibm58861.2023.10385497.
Wu, Xinbo, Alexandru Hanganu, Ayuko Hoshino, and Lav R. Varshney. "Source Identification for Exosomal Communication via Protein Language Models". In 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2022. http://dx.doi.org/10.1109/mlsp55214.2022.9943418.
Iinuma, Naoki, Makoto Miwa, and Yutaka Sasaki. "Improving Supervised Drug-Protein Relation Extraction with Distantly Supervised Models". In Proceedings of the 21st Workshop on Biomedical Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.bionlp-1.16.
Texto completo da fonteRelatórios de organizações sobre o assunto "Protein language models"
Wu, Jyun-Jie. Improving Predictive Efficiency and Literature Quality Assessment for Lung Cancer Complications Post-Proton Therapy Through Large Language Models and Meta-Analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, August 2024. http://dx.doi.org/10.37766/inplasy2024.8.0103.
Shani, Uri, Lynn Dudley, Alon Ben-Gal, Menachem Moshelion, and Yajun Wu. Root Conductance, Root-soil Interface Water Potential, Water and Ion Channel Function, and Tissue Expression Profile as Affected by Environmental Conditions. United States Department of Agriculture, October 2007. http://dx.doi.org/10.32747/2007.7592119.bard.
Melnyk, Iurii. JUSTIFICATION OF OCCUPATION IN GERMAN (1938) AND RUSSIAN (2014) MEDIA: SUBSTITUTION OF AGGRESSOR AND VICTIM. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11101.
Yatsymirska, Mariya. KEY IMPRESSIONS OF 2020 IN JOURNALISTIC TEXTS. Ivan Franko National University of Lviv, March 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.
Or, Etti, David Galbraith, and Anne Fennell. Exploring mechanisms involved in grape bud dormancy: Large-scale analysis of expression reprogramming following controlled dormancy induction and dormancy release. United States Department of Agriculture, December 2002. http://dx.doi.org/10.32747/2002.7587232.bard.