Contents
Selection of scholarly literature on the topic "Contextualized Language Models"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Contextualized Language Models."
Next to each work in the bibliography there is an "Add to bibliography" option. Use it, and your bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read an online abstract of the work, provided the relevant parameters are available in its metadata.
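Purely as an illustration of what such automatic formatting produces, here is a minimal, hypothetical Python sketch (not the site's actual implementation) that renders one journal article from the list below using two simplified style templates; the Entry fields and the two render functions are invented for this example.

# Hypothetical illustration only: render one entry from the list below in two
# simplified citation-style templates. Not the site's code.
from dataclasses import dataclass

@dataclass
class Entry:
    authors: str   # author string as it appears on this page
    title: str
    journal: str
    volume: str
    year: int
    pages: str
    doi: str

def apa_like(e: Entry) -> str:
    # Simplified APA-style ordering: Authors (Year). Title. Journal, Volume, Pages. DOI
    return f"{e.authors} ({e.year}). {e.title}. {e.journal}, {e.volume}, {e.pages}. {e.doi}"

def chicago_like(e: Entry) -> str:
    # Simplified Chicago-style ordering: Authors. "Title." Journal Volume (Year): Pages. DOI
    return f'{e.authors}. "{e.title}." {e.journal} {e.volume} ({e.year}): {e.pages}. {e.doi}'

entry = Entry(
    authors="Myagmar, Batsergelen, Jie Li, and Shigetomo Kimura",
    title="Cross-Domain Sentiment Classification With Bidirectional Contextualized Transformer Language Models",
    journal="IEEE Access",
    volume="7",
    year=2019,
    pages="163219-30",
    doi="http://dx.doi.org/10.1109/access.2019.2952360",
)

print(apa_like(entry))
print(chicago_like(entry))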
Journal articles on the topic "Contextualized Language Models"
El Adlouni, Yassine, Noureddine En Nahnahi, Said Ouatik El Alaoui, Mohammed Meknassi, Horacio Rodríguez, and Nabil Alami. "Arabic Biomedical Community Question Answering Based on Contextualized Embeddings." International Journal of Intelligent Information Technologies 17, no. 3 (2021): 13–29. http://dx.doi.org/10.4018/ijiit.2021070102.
Zhou, Xuhui, Yue Zhang, Leyang Cui, and Dandan Huang. "Evaluating Commonsense in Pre-Trained Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 9733–40. http://dx.doi.org/10.1609/aaai.v34i05.6523.
Myagmar, Batsergelen, Jie Li, and Shigetomo Kimura. "Cross-Domain Sentiment Classification With Bidirectional Contextualized Transformer Language Models." IEEE Access 7 (2019): 163219–30. http://dx.doi.org/10.1109/access.2019.2952360.
Li, Yichen, Yintong Huo, Renyi Zhong, et al. "Go Static: Contextualized Logging Statement Generation." Proceedings of the ACM on Software Engineering 1, FSE (2024): 609–30. http://dx.doi.org/10.1145/3643754.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations." PLOS ONE 16, no. 9 (2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Xu, Yifei, Jingqiao Zhang, Ru He, et al. "SAS: Self-Augmentation Strategy for Language Model Pre-training." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 10 (2022): 11586–94. http://dx.doi.org/10.1609/aaai.v36i10.21412.
Cong, Yan. "AI Language Models: An Opportunity to Enhance Language Learning." Informatics 11, no. 3 (2024): 49. http://dx.doi.org/10.3390/informatics11030049.
Zhang, Shuiliang, Hai Zhao, Junru Zhou, Xi Zhou, and Xiang Zhou. "Semantics-Aware Inferential Network for Natural Language Understanding." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 16 (2021): 14437–45. http://dx.doi.org/10.1609/aaai.v35i16.17697.
Schumacher, Elliot, and Mark Dredze. "Learning unsupervised contextual representations for medical synonym discovery." JAMIA Open 2, no. 4 (2019): 538–46. http://dx.doi.org/10.1093/jamiaopen/ooz057.
Zhang, Yuhan, Wenqi Chen, Ruihan Zhang, and Xiajie Zhang. "Representing affect information in word embeddings." Experiments in Linguistic Meaning 2 (January 27, 2023): 310. http://dx.doi.org/10.3765/elm.2.5391.
Dissertations on the topic "Contextualized Language Models"
Borggren, Lukas. "Automatic Categorization of News Articles With Contextualized Language Models." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177004.
Portnoff, Scott R. "(1) The case for using foreign language pedagogies in introductory computer programming instruction (2) A contextualized pre-AP computer programming curriculum: Models and simulations for exploring real-world cross-curricular topics." Thesis, California State University, Los Angeles, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10132126.
Books on the topic "Contextualized Language Models"
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 2nd ed. Heinle & Heinle, 2000.
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 3rd ed. Thomson, Heinle, 2005.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 2nd ed. Heinle & Heinle Publishers, 1999.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 3rd ed. Heinle, 2004.
Nelson, Eric S. Heidegger and Dao. Bloomsbury Publishing Plc, 2023. http://dx.doi.org/10.5040/9781350411937.
Balbo, Andrea. Traces of Actio in Fragmentary Roman Orators. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198788201.003.0014.
McNaughton, James. "It all boils down to a question of words". Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198822547.003.0006.
Burke, Tony, and Brent Landau, eds. New Testament Apocrypha. Wm. B. Eerdmans Publishing Co., 2016. https://doi.org/10.5040/bci-0kbz.
Walker, Katherine. Shakespeare and Science. Bloomsbury Publishing Plc, 2022. http://dx.doi.org/10.5040/9781350044654.
Book chapters on the topic "Contextualized Language Models"
Gao, Ya, Shaoxiong Ji, Tongxuan Zhang, Prayag Tiwari, and Pekka Marttinen. "Contextualized Graph Embeddings for Adverse Drug Event Detection." In Machine Learning and Knowledge Discovery in Databases. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-26390-3_35.
Li, Ruijia, Yiting Wang, Chanjin Zheng, Yuan-Hao Jiang, and Bo Jiang. "Generating Contextualized Mathematics Multiple-Choice Questions Utilizing Large Language Models." In Artificial Intelligence in Education. Posters and Late Breaking Results, Workshops and Tutorials, Industry and Innovation Tracks, Practitioners, Doctoral Consortium and Blue Sky. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-64315-6_48.
Santos, Joaquim, Henrique D. P. dos Santos, Fábio Tabalipa, and Renata Vieira. "De-Identification of Clinical Notes Using Contextualized Language Models and a Token Classifier." In Intelligent Systems. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-91699-2_3.
Straka, Milan, Jakub Náplava, Jana Straková, and David Samuel. "RobeCzech: Czech RoBERTa, a Monolingual Contextualized Language Representation Model." In Text, Speech, and Dialogue. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83527-9_17.
Sarhan, Injy, and Marco R. Spruit. "Contextualized Word Embeddings in a Neural Open Information Extraction Model." In Natural Language Processing and Information Systems. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23281-8_31.
Yan, Faren, Peng Yu, and Xin Chen. "LTNER: Large Language Model Tagging for Named Entity Recognition with Contextualized Entity Marking." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2024. https://doi.org/10.1007/978-3-031-78495-8_25.
Ciula, Arianna, Øyvind Eide, Cristina Marras, and Patrick Sahle. "Model and Modelling in Digital Humanities." In Modelling Between Digital and Humanities. Open Book Publishers, 2023. http://dx.doi.org/10.11647/obp.0369.01.
Clossey, Luke. "10. Making Canon." In Jesus and the Making of the Modern Mind, 1380-1520. Open Book Publishers, 2024. http://dx.doi.org/10.11647/obp.0371.10.
Tran, Phuc, and Marina Tropmann-Frick. "Global Contextualized Representations: Enhancing Machine Reading Comprehension with Graph Neural Networks." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2025. https://doi.org/10.3233/faia241571.
Toddenroth, Dennis. "Classifiers of Medical Eponymy in Scientific Texts." In Caring is Sharing – Exploiting the Value in Data for Health and Innovation. IOS Press, 2023. http://dx.doi.org/10.3233/shti230271.
Conference papers on the topic "Contextualized Language Models"
Liétard, Bastien, Pascal Denis, and Mikaela Keller. "To Word Senses and Beyond: Inducing Concepts with Contextualized Language Models." In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.emnlp-main.156.
Park, Jun-Hyung, Mingyu Lee, Junho Kim, and SangKeun Lee. "Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models." In Findings of the Association for Computational Linguistics ACL 2024. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.346.
Thapa, Maya, Puneet Kapoor, Sakshi Kaushal, and Ishani Sharma. "A Review of Contextualized Word Embeddings and Pre-Trained Language Models, with a Focus on GPT and BERT." In International Conference on Cognitive & Cloud Computing. SCITEPRESS - Science and Technology Publications, 2024. https://doi.org/10.5220/0013305900004646.
Mudgal, Ananya, Anshul Sharma, and Yugnanda Malhotra. "Bridging the Gap: Towards Contextualised Optical Character Recognition using Large Language Models." In Frontiers in Optics. Optica Publishing Group, 2024. https://doi.org/10.1364/fio.2024.jd4a.106.
Shaghaghian, Shohreh, Luna Yue Feng, Borna Jafarpour, and Nicolai Pogrebnyakov. "Customizing Contextualized Language Models for Legal Document Reviews." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378201.
Yagi, Sane Mo, Youssef Mansour, Firuz Kamalov, and Ashraf Elnagar. "Evaluation of Arabic-Based Contextualized Word Embedding Models." In 2021 International Conference on Asian Language Processing (IALP). IEEE, 2021. http://dx.doi.org/10.1109/ialp54817.2021.9675208.
Mansour, Youssef, and Ashraf Elnagar. "Sarcasm Detection in Arabic Text Using Contextualized Models." In 2023 International Conference on Asian Language Processing (IALP). IEEE, 2023. http://dx.doi.org/10.1109/ialp61005.2023.10337175.
Ross, Hayley, Jonathon Cai, and Bonan Min. "Exploring Contextualized Neural Language Models for Temporal Dependency Parsing." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.689.
Okamoto, Takayoshi, Tetsuya Honda, and Koji Eguchi. "Locally contextualized smoothing of language models for sentiment sentence retrieval." In Proceeding of the 1st international CIKM workshop. ACM Press, 2009. http://dx.doi.org/10.1145/1651461.1651475.
He, Bin, Di Zhou, Jinghui Xiao, et al. "BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models." In Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.findings-emnlp.207.