Contents
Selection of scholarly literature on the topic "Contextualized Language Models"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Contextualized Language Models."
Next to every entry in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online abstract, provided the relevant parameters are present in the metadata.
Journal articles on the topic "Contextualized Language Models"
Myagmar, Batsergelen, Jie Li, and Shigetomo Kimura. "Cross-Domain Sentiment Classification With Bidirectional Contextualized Transformer Language Models." IEEE Access 7 (2019): 163219–30. http://dx.doi.org/10.1109/access.2019.2952360.
El Adlouni, Yassine, Noureddine En Nahnahi, Said Ouatik El Alaoui, Mohammed Meknassi, Horacio Rodríguez, and Nabil Alami. "Arabic Biomedical Community Question Answering Based on Contextualized Embeddings." International Journal of Intelligent Information Technologies 17, no. 3 (July 2021): 13–29. http://dx.doi.org/10.4018/ijiit.2021070102.
Zhou, Xuhui, Yue Zhang, Leyang Cui, and Dandan Huang. "Evaluating Commonsense in Pre-Trained Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 9733–40. http://dx.doi.org/10.1609/aaai.v34i05.6523.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations." PLOS ONE 16, no. 9 (September 21, 2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Schumacher, Elliot, and Mark Dredze. "Learning unsupervised contextual representations for medical synonym discovery." JAMIA Open 2, no. 4 (November 4, 2019): 538–46. http://dx.doi.org/10.1093/jamiaopen/ooz057.
Schick, Timo, and Hinrich Schütze. "Rare Words: A Major Problem for Contextualized Embeddings and How to Fix it by Attentive Mimicking." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8766–74. http://dx.doi.org/10.1609/aaai.v34i05.6403.
Strokach, Alexey, Tian Yu Lu, and Philip M. Kim. "ELASPIC2 (EL2): Combining Contextualized Language Models and Graph Neural Networks to Predict Effects of Mutations." Journal of Molecular Biology 433, no. 11 (May 2021): 166810. http://dx.doi.org/10.1016/j.jmb.2021.166810.
Dev, Sunipa, Tao Li, Jeff M. Phillips, and Vivek Srikumar. "On Measuring and Mitigating Biased Inferences of Word Embeddings." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7659–66. http://dx.doi.org/10.1609/aaai.v34i05.6267.
Garí Soler, Aina, and Marianna Apidianaki. "Let's Play Mono-Poly: BERT Can Reveal Words' Polysemy Level and Partitionability into Senses." Transactions of the Association for Computational Linguistics 9 (2021): 825–44. http://dx.doi.org/10.1162/tacl_a_00400.
Saha, Koustuv, Ted Grover, Stephen M. Mattingly, Vedant Das Swain, Pranshu Gupta, Gonzalo J. Martinez, Pablo Robles-Granda, Gloria Mark, Aaron Striegel, and Munmun De Choudhury. "Person-Centered Predictions of Psychological Constructs with Social Media Contextualized by Multimodal Sensing." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 1 (March 19, 2021): 1–32. http://dx.doi.org/10.1145/3448117.
Dissertations on the topic "Contextualized Language Models"
Borggren, Lukas. "Automatic Categorization of News Articles With Contextualized Language Models." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177004.
Portnoff, Scott R. "(1) The case for using foreign language pedagogies in introductory computer programming instruction (2) A contextualized pre-AP computer programming curriculum: Models and simulations for exploring real-world cross-curricular topics." Thesis, California State University, Los Angeles, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10132126.
Large numbers of novice programmers have been failing postsecondary introductory computer science programming (CS1) courses for nearly four decades. Student learning is much worse in secondary programming courses of similar or even lesser rigor. This has critical implications for efforts to reclassify Computer Science (CS) as a core secondary subject. State departments of education have little incentive to do so until it can be demonstrated that most grade-level students will not only pass such classes, but will be well-prepared to succeed in subsequent vertically aligned coursework.
One rarely considered cause for such massive failure is insufficient pedagogic attention to teaching a programming language (PL) as a language, per se. Students who struggle with acquiring proficiency in using a PL can be likened to students who flounder in a French class due to a poor grasp of the language's syntactic or semantic features. Though natural languages (NL) and PLs differ in many key respects, a recently reported (2014) fMRI study has demonstrated that comprehension of computer programs primarily utilizes regions of the brain involved in language processing, not math. The implications for CS pedagogy are that, if PLs are learned in ways fundamentally similar to how second languages (L2) are acquired, foreign language pedagogies (FLP) and second language acquisition (SLA) theories can be key sources for informing the crafting of effective PL teaching strategies.
In this regard, key features of contemporary L2 pedagogies relevant to effective PL instruction—reflecting the late 20th-century shift in emphasis from cognitive learning that stressed grammatical knowledge, to one that facilitates communication and practical uses of the language—are: (1) repetitive and comprehensible input in a variety of contexts, and (2) motivated, meaningful communication and interaction.
Informed by these principles, four language-based strategies adapted for PL instruction are described, the first to help students acquire syntax and three others for learning semantics: (a) memorization; (b) setting components in relief; (c) transformations; and (d) ongoing exposure.
Anecdotal observations in my classroom have long indicated that memorization of small programs and program fragments can immediately and drastically reduce the occurrence of syntax errors among novice pre-AP Java programming students. A modest first experiment attempting to confirm the effect was statistically unconvincing: for students most likely to struggle, the Pearson coefficient of −0.474 (p < 0.064) suggested a low-modest inverse correlation. A follow-up study will be better designed. Still, a possible explanation for the anecdotal phenomenon is that the repetition required for proficient memorization activates the same subconscious language acquisition processes that construct NL grammars when learners are exposed to language data.
Dismal retention rates subsequent to the introductory programming course have historically also been a persistent problem. One key factor impacting attrition is a student's intrinsic motivation, which is shaped both by interest in, and self-efficacy with regards to, the subject matter. Interest involves not just CS concepts, but also context, the domains used to illustrate how one can apply those concepts. One way to tap into a wide range of student interests is to demonstrate the capacity of CS to explore, model, simulate and solve non-trivial problems in domains across the academic spectrum, fields that students already value and whose basic concepts they already understand.
An original University of California "G" elective (UCOP "a-g" approved) pre-AP programming course along these principles is described. In this graphics-based Processing course, students are guided through the process of writing and studying small dynamic art programs, progressing to mid-size graphics programs that model or simulate real-world problems and phenomena in geography, biology, political science and astronomy. The contextualized course content combined with the language-specific strategies outlined above address both interest and self-efficacy. Although anecdotally these appear to have a positive effect on student understanding and retention, studies need to be done on a larger scale to validate these outcomes.
Finally, a critique is offered of the movement to replace rigorous secondary programming instruction with survey courses—particularly Exploring Computer Science and APCS Principles—under the guise of "democratizing" secondary CS education or to address the severe and persistent demographic disparities. This group of educators has promulgated a nonsensical fiction that programming is simply one of many subdisciplines of the field, rather than the core skill needed to understand all other CS topics in any deep and meaningful way. These courses present a facade of mitigating demographic disparities, but leave participants no better prepared for subsequent CS study.
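The Pearson coefficient quoted in the abstract above can be reproduced in a few lines of standard-library Python. The paired scores below are invented for illustration only, since the dissertation's actual data are not included here:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical paired observations: memorization proficiency vs. syntax-error
# counts for eight novice programmers (values invented for illustration).
memorization = [55, 60, 62, 70, 75, 80, 85, 90]
syntax_errors = [14, 12, 13, 10, 9, 8, 9, 5]

r = pearson_r(memorization, syntax_errors)
print(f"r = {r:.3f}")  # negative r: more memorization, fewer syntax errors
```

The p-value also quoted in the abstract would additionally require a significance test on r, e.g. a t-test with n − 2 degrees of freedom.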
Books on the topic "Contextualized Language Models"
Glisan, Eileen W., ed. Teacher's Handbook: Contextualized Language Instruction. 2nd ed. Boston, MA: Heinle & Heinle, 2000.
Glisan, Eileen W., ed. Teacher's Handbook: Contextualized Language Instruction. 3rd ed. Southbank, Victoria, Australia: Thomson, Heinle, 2005.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 3rd ed. Heinle, 2004.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 2nd ed. Heinle & Heinle Publishers, 1999.
Den vollen Inhalt der Quelle findenBalbo, Andrea. Traces of Actio in Fragmentary Roman Orators. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198788201.003.0014.
McNaughton, James. "It all boils down to a question of words." Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198822547.003.0006.
Elior, Rachel. Jewish Mysticism. Translated by Arthur B. Millman. Liverpool University Press, 2007. http://dx.doi.org/10.3828/liverpool/9781874774679.001.0001.
Jones, Chris. Fossil Poetry. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198824527.001.0001.
Der volle Inhalt der QuelleBuchteile zum Thema "Contextualized Language Models"
Straka, Milan, Jakub Náplava, Jana Straková, and David Samuel. "RobeCzech: Czech RoBERTa, a Monolingual Contextualized Language Representation Model." In Text, Speech, and Dialogue, 197–209. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83527-9_17.
Sarhan, Injy, and Marco R. Spruit. "Contextualized Word Embeddings in a Neural Open Information Extraction Model." In Natural Language Processing and Information Systems, 359–67. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23281-8_31.
Harrison, S. J. "The Poetics of Fiction: Poetic Influence on the Language of Apuleius' Metamorphoses." In Aspects of the Language of Latin Prose. British Academy, 2005. http://dx.doi.org/10.5871/bacad/9780197263327.003.0013.
Hancı-Azizoglu, Eda Başak, and Nurdan Kavaklı. "Creative Digital Writing." In Digital Pedagogies and the Transformation of Language Education, 250–66. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6745-6.ch013.
Martinez, Martha I., Anya Hurwitz, Jennifer Analla, Laurie Olsen, and Joanna Meadvin. "Cultivating Rich Language Development, Deep Learning, and Joyful Classrooms for English Learners." In Handbook of Research on Engaging Immigrant Families and Promoting Academic Success for English Language Learners, 269–93. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8283-0.ch014.
Davies, Joshua. "The language of gesture: Untimely bodies and contemporary performance." In Visions and ruins. Manchester University Press, 2018. http://dx.doi.org/10.7228/manchester/9781526125934.003.0005.
Green, Sharon E., and Mason Gordon. "Teaching Literacy through Technology in the Middle School." In Academic Knowledge Construction and Multimodal Curriculum Development, 230–42. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4797-8.ch014.
Martin, Alison E. "Styling Science." In Nature Translated, 22–39. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9781474439329.003.0002.
Browne, Craig, and Andrew P. Lynch. "Introduction." In Taylor and Politics, 1–16. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9780748691937.003.0001.
Perler, Dominik. "The Alienation Effect in the Historiography of Philosophy." In Philosophy and the Historical Perspective, edited by Marcel van Ackeren and Lee Klein, 140–54. British Academy, 2018. http://dx.doi.org/10.5871/bacad/9780197266298.003.0009.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Contextualized Language Models"
Shaghaghian, Shohreh, Luna Yue Feng, Borna Jafarpour, and Nicolai Pogrebnyakov. "Customizing Contextualized Language Models for Legal Document Reviews." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378201.
Ross, Hayley, Jonathon Cai, and Bonan Min. "Exploring Contextualized Neural Language Models for Temporal Dependency Parsing." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.689.
Okamoto, Takayoshi, Tetsuya Honda, and Koji Eguchi. "Locally contextualized smoothing of language models for sentiment sentence retrieval." In Proceeding of the 1st international CIKM workshop. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1651461.1651475.
He, Bin, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, and Tong Xu. "BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models." In Findings of the Association for Computational Linguistics: EMNLP 2020. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.findings-emnlp.207.
Wang, Yixiao, Zied Bouraoui, Luis Espinosa Anke, and Steven Schockaert. "Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection." In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.repl4nlp-1.19.
Ginzburg, Dvir, Itzik Malkiel, Oren Barkan, Avi Caciularu, and Noam Koenigstein. "Self-Supervised Document Similarity Ranking via Contextualized Language Models and Hierarchical Inference." In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-acl.272.
Minh Le, Thao, Vuong Le, Svetha Venkatesh, and Truyen Tran. "Dynamic Language Binding in Relational Visual Reasoning." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/114.
Qin, Libo, Minheng Ni, Yue Zhang, and Wanxiang Che. "CoSDA-ML: Multi-Lingual Code-Switching Data Augmentation for Zero-Shot Cross-Lingual NLP." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/533.
Chen, Zhiyu, Mohamed Trabelsi, Jeff Heflin, Yinan Xu, and Brian D. Davison. "Table Search Using a Deep Contextualized Language Model." In SIGIR '20: The 43rd International ACM SIGIR conference on research and development in Information Retrieval. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3397271.3401044.
Liu, Liyuan, Xiang Ren, Jingbo Shang, Xiaotao Gu, Jian Peng, and Jiawei Han. "Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1153.