A selection of scholarly literature on the topic "Contextualized Language Models"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the lists of current articles, books, theses, reports, and other scholarly sources on the topic "Contextualized Language Models".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic entry for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online abstract, provided the relevant parameters are included in the metadata.
Journal articles on the topic "Contextualized Language Models"
Myagmar, Batsergelen, Jie Li, and Shigetomo Kimura. "Cross-Domain Sentiment Classification With Bidirectional Contextualized Transformer Language Models." IEEE Access 7 (2019): 163219–30. http://dx.doi.org/10.1109/access.2019.2952360.
El Adlouni, Yassine, Noureddine En Nahnahi, Said Ouatik El Alaoui, Mohammed Meknassi, Horacio Rodríguez, and Nabil Alami. "Arabic Biomedical Community Question Answering Based on Contextualized Embeddings." International Journal of Intelligent Information Technologies 17, no. 3 (2021): 13–29. http://dx.doi.org/10.4018/ijiit.2021070102.
Zhou, Xuhui, Yue Zhang, Leyang Cui, and Dandan Huang. "Evaluating Commonsense in Pre-Trained Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 9733–40. http://dx.doi.org/10.1609/aaai.v34i05.6523.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations." PLOS ONE 16, no. 9 (2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Schumacher, Elliot, and Mark Dredze. "Learning unsupervised contextual representations for medical synonym discovery." JAMIA Open 2, no. 4 (2019): 538–46. http://dx.doi.org/10.1093/jamiaopen/ooz057.
Schick, Timo, and Hinrich Schütze. "Rare Words: A Major Problem for Contextualized Embeddings and How to Fix it by Attentive Mimicking." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8766–74. http://dx.doi.org/10.1609/aaai.v34i05.6403.
Strokach, Alexey, Tian Yu Lu, and Philip M. Kim. "ELASPIC2 (EL2): Combining Contextualized Language Models and Graph Neural Networks to Predict Effects of Mutations." Journal of Molecular Biology 433, no. 11 (2021): 166810. http://dx.doi.org/10.1016/j.jmb.2021.166810.
Dev, Sunipa, Tao Li, Jeff M. Phillips, and Vivek Srikumar. "On Measuring and Mitigating Biased Inferences of Word Embeddings." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 7659–66. http://dx.doi.org/10.1609/aaai.v34i05.6267.
Garí Soler, Aina, and Marianna Apidianaki. "Let’s Play Mono-Poly: BERT Can Reveal Words’ Polysemy Level and Partitionability into Senses." Transactions of the Association for Computational Linguistics 9 (2021): 825–44. http://dx.doi.org/10.1162/tacl_a_00400.
Saha, Koustuv, Ted Grover, Stephen M. Mattingly, et al. "Person-Centered Predictions of Psychological Constructs with Social Media Contextualized by Multimodal Sensing." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 1 (2021): 1–32. http://dx.doi.org/10.1145/3448117.
Theses on the topic "Contextualized Language Models"
Borggren, Lukas. "Automatic Categorization of News Articles With Contextualized Language Models." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177004.
Portnoff, Scott R. "(1) The case for using foreign language pedagogies in introductory computer programming instruction (2) A contextualized pre-AP computer programming curriculum: Models and simulations for exploring real-world cross-curricular topics." Thesis, California State University, Los Angeles, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10132126.
Der volle Inhalt der QuelleBücher zum Thema "Contextualized Language Models"
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 2nd ed. Heinle & Heinle, 2000.
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 3rd ed. Thomson, Heinle, 2005.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 3rd ed. Heinle, 2004.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 2nd ed. Heinle & Heinle Publishers, 1999.
Den vollen Inhalt der Quelle findenBalbo, Andrea. Traces of Actio in Fragmentary Roman Orators. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198788201.003.0014.
McNaughton, James. “It all boils down to a question of words”. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198822547.003.0006.
Elior, Rachel. Jewish Mysticism. Translated by Arthur B. Millman. Liverpool University Press, 2007. http://dx.doi.org/10.3828/liverpool/9781874774679.001.0001.
Jones, Chris. Fossil Poetry. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198824527.001.0001.
Der volle Inhalt der QuelleBuchteile zum Thema "Contextualized Language Models"
Straka, Milan, Jakub Náplava, Jana Straková, and David Samuel. "RobeCzech: Czech RoBERTa, a Monolingual Contextualized Language Representation Model." In Text, Speech, and Dialogue. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83527-9_17.
Sarhan, Injy, and Marco R. Spruit. "Contextualized Word Embeddings in a Neural Open Information Extraction Model." In Natural Language Processing and Information Systems. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23281-8_31.
Harrison, S. J. "The Poetics of Fiction: Poetic Influence on the Language of Apuleius’ Metamorphoses." In Aspects of the Language of Latin Prose. British Academy, 2005. http://dx.doi.org/10.5871/bacad/9780197263327.003.0013.
Hancı-Azizoglu, Eda Başak, and Nurdan Kavaklı. "Creative Digital Writing." In Digital Pedagogies and the Transformation of Language Education. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6745-6.ch013.
Martinez, Martha I., Anya Hurwitz, Jennifer Analla, Laurie Olsen, and Joanna Meadvin. "Cultivating Rich Language Development, Deep Learning, and Joyful Classrooms for English Learners." In Handbook of Research on Engaging Immigrant Families and Promoting Academic Success for English Language Learners. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8283-0.ch014.
Davies, Joshua. "The language of gesture: Untimely bodies and contemporary performance." In Visions and ruins. Manchester University Press, 2018. http://dx.doi.org/10.7228/manchester/9781526125934.003.0005.
Green, Sharon E., and Mason Gordon. "Teaching Literacy through Technology in the Middle School." In Academic Knowledge Construction and Multimodal Curriculum Development. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4797-8.ch014.
Martin, Alison E. "Styling Science." In Nature Translated. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9781474439329.003.0002.
Browne, Craig, and Andrew P. Lynch. "Introduction." In Taylor and Politics. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9780748691937.003.0001.
Perler, Dominik. "The Alienation Effect in the Historiography of Philosophy." In Philosophy and the Historical Perspective, edited by Marcel van Ackeren and Lee Klein. British Academy, 2018. http://dx.doi.org/10.5871/bacad/9780197266298.003.0009.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Contextualized Language Models"
Shaghaghian, Shohreh, Luna Yue Feng, Borna Jafarpour, and Nicolai Pogrebnyakov. "Customizing Contextualized Language Models for Legal Document Reviews." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378201.
Ross, Hayley, Jonathon Cai, and Bonan Min. "Exploring Contextualized Neural Language Models for Temporal Dependency Parsing." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.689.
Okamoto, Takayoshi, Tetsuya Honda, and Koji Eguchi. "Locally contextualized smoothing of language models for sentiment sentence retrieval." In Proceeding of the 1st international CIKM workshop. ACM Press, 2009. http://dx.doi.org/10.1145/1651461.1651475.
He, Bin, Di Zhou, Jinghui Xiao, et al. "BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models." In Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.findings-emnlp.207.
Wang, Yixiao, Zied Bouraoui, Luis Espinosa Anke, and Steven Schockaert. "Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection." In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021). Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.repl4nlp-1.19.
Ginzburg, Dvir, Itzik Malkiel, Oren Barkan, Avi Caciularu, and Noam Koenigstein. "Self-Supervised Document Similarity Ranking via Contextualized Language Models and Hierarchical Inference." In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-acl.272.
Minh Le, Thao, Vuong Le, Svetha Venkatesh, and Truyen Tran. "Dynamic Language Binding in Relational Visual Reasoning." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/114.
Qin, Libo, Minheng Ni, Yue Zhang, and Wanxiang Che. "CoSDA-ML: Multi-Lingual Code-Switching Data Augmentation for Zero-Shot Cross-Lingual NLP." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/533.
Chen, Zhiyu, Mohamed Trabelsi, Jeff Heflin, Yinan Xu, and Brian D. Davison. "Table Search Using a Deep Contextualized Language Model." In SIGIR '20: The 43rd International ACM SIGIR conference on research and development in Information Retrieval. ACM, 2020. http://dx.doi.org/10.1145/3397271.3401044.
Liu, Liyuan, Xiang Ren, Jingbo Shang, Xiaotao Gu, Jian Peng, and Jiawei Han. "Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1153.