Academic literature on the topic 'Contextualized Language Models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Contextualized Language Models.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Contextualized Language Models"
Myagmar, Batsergelen, Jie Li, and Shigetomo Kimura. "Cross-Domain Sentiment Classification With Bidirectional Contextualized Transformer Language Models." IEEE Access 7 (2019): 163219–30. http://dx.doi.org/10.1109/access.2019.2952360.
El Adlouni, Yassine, Noureddine En Nahnahi, Said Ouatik El Alaoui, Mohammed Meknassi, Horacio Rodríguez, and Nabil Alami. "Arabic Biomedical Community Question Answering Based on Contextualized Embeddings." International Journal of Intelligent Information Technologies 17, no. 3 (July 2021): 13–29. http://dx.doi.org/10.4018/ijiit.2021070102.
Zhou, Xuhui, Yue Zhang, Leyang Cui, and Dandan Huang. "Evaluating Commonsense in Pre-Trained Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 9733–40. http://dx.doi.org/10.1609/aaai.v34i05.6523.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations." PLOS ONE 16, no. 9 (September 21, 2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Schumacher, Elliot, and Mark Dredze. "Learning unsupervised contextual representations for medical synonym discovery." JAMIA Open 2, no. 4 (November 4, 2019): 538–46. http://dx.doi.org/10.1093/jamiaopen/ooz057.
Schick, Timo, and Hinrich Schütze. "Rare Words: A Major Problem for Contextualized Embeddings and How to Fix it by Attentive Mimicking." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8766–74. http://dx.doi.org/10.1609/aaai.v34i05.6403.
Strokach, Alexey, Tian Yu Lu, and Philip M. Kim. "ELASPIC2 (EL2): Combining Contextualized Language Models and Graph Neural Networks to Predict Effects of Mutations." Journal of Molecular Biology 433, no. 11 (May 2021): 166810. http://dx.doi.org/10.1016/j.jmb.2021.166810.
Dev, Sunipa, Tao Li, Jeff M. Phillips, and Vivek Srikumar. "On Measuring and Mitigating Biased Inferences of Word Embeddings." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7659–66. http://dx.doi.org/10.1609/aaai.v34i05.6267.
Garí Soler, Aina, and Marianna Apidianaki. "Let’s Play Mono-Poly: BERT Can Reveal Words’ Polysemy Level and Partitionability into Senses." Transactions of the Association for Computational Linguistics 9 (2021): 825–44. http://dx.doi.org/10.1162/tacl_a_00400.
Saha, Koustuv, Ted Grover, Stephen M. Mattingly, Vedant Das Swain, Pranshu Gupta, Gonzalo J. Martinez, Pablo Robles-Granda, Gloria Mark, Aaron Striegel, and Munmun De Choudhury. "Person-Centered Predictions of Psychological Constructs with Social Media Contextualized by Multimodal Sensing." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 1 (March 19, 2021): 1–32. http://dx.doi.org/10.1145/3448117.
Full textDissertations / Theses on the topic "Contextualized Language Models"
Borggren, Lukas. "Automatic Categorization of News Articles With Contextualized Language Models." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177004.
Portnoff, Scott R. "(1) The case for using foreign language pedagogies in introductory computer programming instruction (2) A contextualized pre-AP computer programming curriculum: Models and simulations for exploring real-world cross-curricular topics." Thesis, California State University, Los Angeles, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10132126.
Full textLarge numbers of novice programmers have been failing postsecondary introductory computer science programming (CS1) courses for nearly four decades. Student learning is much worse in secondary programming courses of similar or even lesser rigor. This has critical implications for efforts to reclassify Computer Science (CS) as a core secondary subject. State departments of education have little incentive to do so until it can be demonstrated that most grade-level students will not only pass such classes, but will be well-prepared to succeed in subsequent vertically aligned coursework.
One rarely considered cause for such massive failure is insufficient pedagogic attention to teaching a programming language (PL) as a language, per se. Students who struggle with acquiring proficiency in using a PL can be likened to students who flounder in a French class due to a poor grasp of the language's syntactic or semantic features. Though natural languages (NL) and PLs differ in many key respects, a recently reported (2014) fMRI study has demonstrated that comprehension of computer programs primarily utilizes regions of the brain involved in language processing, not math. The implications for CS pedagogy are that, if PLs are learned in ways fundamentally similar to how second languages (L2) are acquired, foreign language pedagogies (FLP) and second language acquisition (SLA) theories can be key sources for informing the crafting of effective PL teaching strategies.
In this regard, key features of contemporary L2 pedagogies relevant to effective PL instruction—reflecting the late 20th-century shift in emphasis from cognitive learning that stressed grammatical knowledge, to one that facilitates communication and practical uses of the language—are: (1) repetitive and comprehensible input in a variety of contexts, and (2) motivated, meaningful communication and interaction.
Informed by these principles, four language-based strategies adapted for PL instruction are described, the first to help students acquire syntax and three others for learning semantics: (a) memorization; (b) setting components in relief; (c) transformations; and (d) ongoing exposure.
Anecdotal observations in my classroom have long indicated that memorization of small programs and program fragments can immediately and drastically reduce the occurrence of syntax errors among novice pre-AP Java programming students. A modest first experiment attempting to confirm the effect was statistically unconvincing: for the students most likely to struggle, the Pearson coefficient of −0.474 (p < 0.064) suggested a weak-to-moderate inverse correlation that fell just short of significance. A better-designed follow-up study is planned. Still, a possible explanation for the anecdotal phenomenon is that the repetition required for proficient memorization activates the same subconscious language acquisition processes that construct NL grammars when learners are exposed to language data.
Dismal retention rates subsequent to the introductory programming course have historically also been a persistent problem. One key factor impacting attrition is a student's intrinsic motivation, which is shaped both by interest in, and self-efficacy with regards to, the subject matter. Interest involves not just CS concepts, but also context, the domains used to illustrate how one can apply those concepts. One way to tap into a wide range of student interests is to demonstrate the capacity of CS to explore, model, simulate and solve non-trivial problems in domains across the academic spectrum, fields that students already value and whose basic concepts they already understand.
An original University of California "G" elective (UCOP "a-g" approved) pre-AP programming course designed along these principles is described. In this graphics-based Processing course, students are guided through the process of writing and studying small dynamic art programs, progressing to mid-size graphics programs that model or simulate real-world problems and phenomena in geography, biology, political science and astronomy. The contextualized course content combined with the language-specific strategies outlined above addresses both interest and self-efficacy. Although anecdotally these appear to have a positive effect on student understanding and retention, studies need to be done on a larger scale to validate these outcomes.
Finally, a critique is offered of the movement to replace rigorous secondary programming instruction with survey courses—particularly Exploring Computer Science and APCS Principles—under the guise of "democratizing" secondary CS education or to address the severe and persistent demographic disparities. This group of educators has promulgated a nonsensical fiction that programming is simply one of many subdisciplines of the field, rather than the core skill needed to understand all other CS topics in any deep and meaningful way. These courses present a facade of mitigating demographic disparities, but leave participants no better prepared for subsequent CS study.
Books on the topic "Contextualized Language Models"
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 2nd ed. Boston, Mass.: Heinle & Heinle, 2000.
Glisan, Eileen W., ed. Teacher's handbook: Contextualized language instruction. 3rd ed. Southbank, Victoria, Australia: Thomson, Heinle, 2005.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 3rd ed. Heinle, 2004.
Shrum, Judith L., and Eileen W. Glisan. Teacher's Handbook: Contextualized Language Instruction. 2nd ed. Heinle & Heinle Publishers, 1999.
Balbo, Andrea. Traces of Actio in Fragmentary Roman Orators. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198788201.003.0014.
McNaughton, James. "It all boils down to a question of words". Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198822547.003.0006.
Elior, Rachel. Jewish Mysticism. Translated by Arthur B. Millman. Liverpool University Press, 2007. http://dx.doi.org/10.3828/liverpool/9781874774679.001.0001.
Jones, Chris. Fossil Poetry. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198824527.001.0001.
Full textBook chapters on the topic "Contextualized Language Models"
Straka, Milan, Jakub Náplava, Jana Straková, and David Samuel. "RobeCzech: Czech RoBERTa, a Monolingual Contextualized Language Representation Model." In Text, Speech, and Dialogue, 197–209. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83527-9_17.
Sarhan, Injy, and Marco R. Spruit. "Contextualized Word Embeddings in a Neural Open Information Extraction Model." In Natural Language Processing and Information Systems, 359–67. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23281-8_31.
Harrison, S. J. "The Poetics of Fiction: Poetic Influence on the Language of Apuleius’ Metamorphoses." In Aspects of the Language of Latin Prose. British Academy, 2005. http://dx.doi.org/10.5871/bacad/9780197263327.003.0013.
Hancı-Azizoglu, Eda Başak, and Nurdan Kavaklı. "Creative Digital Writing." In Digital Pedagogies and the Transformation of Language Education, 250–66. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6745-6.ch013.
Martinez, Martha I., Anya Hurwitz, Jennifer Analla, Laurie Olsen, and Joanna Meadvin. "Cultivating Rich Language Development, Deep Learning, and Joyful Classrooms for English Learners." In Handbook of Research on Engaging Immigrant Families and Promoting Academic Success for English Language Learners, 269–93. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8283-0.ch014.
Davies, Joshua. "The language of gesture: Untimely bodies and contemporary performance." In Visions and ruins. Manchester University Press, 2018. http://dx.doi.org/10.7228/manchester/9781526125934.003.0005.
Green, Sharon E., and Mason Gordon. "Teaching Literacy through Technology in the Middle School." In Academic Knowledge Construction and Multimodal Curriculum Development, 230–42. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4797-8.ch014.
Martin, Alison E. "Styling Science." In Nature Translated, 22–39. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9781474439329.003.0002.
Browne, Craig, and Andrew P. Lynch. "Introduction." In Taylor and Politics, 1–16. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9780748691937.003.0001.
Perler, Dominik. "The Alienation Effect in the Historiography of Philosophy." In Philosophy and the Historical Perspective, edited by Marcel van Ackeren and Lee Klein, 140–54. British Academy, 2018. http://dx.doi.org/10.5871/bacad/9780197266298.003.0009.
Full textConference papers on the topic "Contextualized Language Models"
Shaghaghian, Shohreh, Luna Yue Feng, Borna Jafarpour, and Nicolai Pogrebnyakov. "Customizing Contextualized Language Models for Legal Document Reviews." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378201.
Ross, Hayley, Jonathon Cai, and Bonan Min. "Exploring Contextualized Neural Language Models for Temporal Dependency Parsing." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.689.
Okamoto, Takayoshi, Tetsuya Honda, and Koji Eguchi. "Locally contextualized smoothing of language models for sentiment sentence retrieval." In Proceeding of the 1st international CIKM workshop. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1651461.1651475.
He, Bin, Di Zhou, Jinghui Xiao, Xin Jiang, Qun Liu, Nicholas Jing Yuan, and Tong Xu. "BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models." In Findings of the Association for Computational Linguistics: EMNLP 2020. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.findings-emnlp.207.
Wang, Yixiao, Zied Bouraoui, Luis Espinosa Anke, and Steven Schockaert. "Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection." In Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.repl4nlp-1.19.
Ginzburg, Dvir, Itzik Malkiel, Oren Barkan, Avi Caciularu, and Noam Koenigstein. "Self-Supervised Document Similarity Ranking via Contextualized Language Models and Hierarchical Inference." In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-acl.272.
Minh Le, Thao, Vuong Le, Svetha Venkatesh, and Truyen Tran. "Dynamic Language Binding in Relational Visual Reasoning." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/114.
Qin, Libo, Minheng Ni, Yue Zhang, and Wanxiang Che. "CoSDA-ML: Multi-Lingual Code-Switching Data Augmentation for Zero-Shot Cross-Lingual NLP." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/533.
Chen, Zhiyu, Mohamed Trabelsi, Jeff Heflin, Yinan Xu, and Brian D. Davison. "Table Search Using a Deep Contextualized Language Model." In SIGIR '20: The 43rd International ACM SIGIR conference on research and development in Information Retrieval. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3397271.3401044.
Liu, Liyuan, Xiang Ren, Jingbo Shang, Xiaotao Gu, Jian Peng, and Jiawei Han. "Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1153.