Journal articles on the topic 'Neural language models'
The following are the top 50 journal articles for research on the topic 'Neural language models.'
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models." Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Bengio, Yoshua. "Neural net language models." Scholarpedia 3, no. 1 (2008): 3881. http://dx.doi.org/10.4249/scholarpedia.3881.
Dong, Li. "Learning natural language interfaces with neural models." AI Matters 7, no. 2 (2021): 14–17. http://dx.doi.org/10.1145/3478369.3478375.
De Coster, Mathieu, and Joni Dambre. "Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation." Information 13, no. 5 (2022): 220. http://dx.doi.org/10.3390/info13050220.
Lau, Mandy. "Artificial intelligence language models and the false fantasy of participatory language policies." Working Papers in Applied Linguistics and Linguistics at York 1 (September 13, 2021): 4–15. http://dx.doi.org/10.25071/2564-2855.5.
Chang, Tyler A., and Benjamin K. Bergen. "Word Acquisition in Neural Language Models." Transactions of the Association for Computational Linguistics 10 (2022): 1–16. http://dx.doi.org/10.1162/tacl_a_00444.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models." International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.095762.
Qi, Kunxun, and Jianfeng Du. "Translation-Based Matching Adversarial Network for Cross-Lingual Natural Language Inference." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8632–39. http://dx.doi.org/10.1609/aaai.v34i05.6387.
Angius, Nicola, Pietro Perconti, Alessio Plebe, and Alessandro Acciai. "The Simulative Role of Neural Language Models in Brain Language Processing." Philosophies 9, no. 5 (2024): 137. http://dx.doi.org/10.3390/philosophies9050137.
Hale, John T., Luca Campanelli, Jixing Li, Shohini Bhattasali, Christophe Pallier, and Jonathan R. Brennan. "Neurocomputational Models of Language Processing." Annual Review of Linguistics 8, no. 1 (2022): 427–46. http://dx.doi.org/10.1146/annurev-linguistics-051421-020803.
Klemen, Matej, and Slavko Zitnik. "Neural coreference resolution for Slovene language." Computer Science and Information Systems, no. 00 (2021): 60. http://dx.doi.org/10.2298/csis201120060k.
Lytvynov, A., P. Andreicheva, V. Bredikhin, and V. Verbytska. "Development Tendencies of Generation Models of the Ukrainian Language." Municipal Economy of Cities 3, no. 184 (2024): 10–15. http://dx.doi.org/10.33042/2522-1809-2024-3-184-10-15.
Park, Myung-Kwan, Keonwoo Koo, Jaemin Lee, and Wonil Chung. "Investigating Syntactic Transfer from English to Korean in Neural L2 Language Models." Studies in Modern Grammar 121 (March 30, 2024): 177–201. http://dx.doi.org/10.14342/smog.2024.121.177.
Kunchukuttan, Anoop, Mitesh Khapra, Gurneet Singh, and Pushpak Bhattacharyya. "Leveraging Orthographic Similarity for Multilingual Neural Transliteration." Transactions of the Association for Computational Linguistics 6 (December 2018): 303–16. http://dx.doi.org/10.1162/tacl_a_00022.
Bayer, Ali Orkan, and Giuseppe Riccardi. "Semantic language models with deep neural networks." Computer Speech & Language 40 (November 2016): 1–22. http://dx.doi.org/10.1016/j.csl.2016.04.001.
Chuchupal, V. Y. "Neural language models for automatic speech recognition." Речевые технологии, no. 1-2 (2020): 27–47. http://dx.doi.org/10.58633/2305-8129_2020_1-2_27.
Tian, Yijun, Huan Song, Zichen Wang, et al. "Graph Neural Prompting with Large Language Models." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 17 (2024): 19080–88. http://dx.doi.org/10.1609/aaai.v38i17.29875.
Schomacker, Thorben, and Marina Tropmann-Frick. "Language Representation Models: An Overview." Entropy 23, no. 11 (2021): 1422. http://dx.doi.org/10.3390/e23111422.
Takahashi, Shuntaro, and Kumiko Tanaka-Ishii. "Evaluating Computational Language Models with Scaling Properties of Natural Language." Computational Linguistics 45, no. 3 (2019): 481–513. http://dx.doi.org/10.1162/coli_a_00355.
Oba, Miyu. "Research Background on Second Language Acquisition in Neural Language Models." Journal of Natural Language Processing 32, no. 2 (2025): 684–90. https://doi.org/10.5715/jnlp.32.684.
Martin, Andrea E. "A Compositional Neural Architecture for Language." Journal of Cognitive Neuroscience 32, no. 8 (2020): 1407–27. http://dx.doi.org/10.1162/jocn_a_01552.
Mukhamadiyev, Abdinabi, Mukhriddin Mukhiddinov, Ilyos Khujayarov, Mannon Ochilov, and Jinsoo Cho. "Development of Language Models for Continuous Uzbek Speech Recognition System." Sensors 23, no. 3 (2023): 1145. http://dx.doi.org/10.3390/s23031145.
Naveenkumar, T. Rudrappa, V. Reddy Mallamma, and Hanumanthappa M. "KHiTE: Multilingual Speech Acquisition to Monolingual Text Translation." Indian Journal of Science and Technology 16, no. 21 (2023): 1572–79. https://doi.org/10.17485/IJST/v16i21.727.
Penner, Regina V. "Large Language Models: A Socio-Philosophical Essay." Galactica Media: Journal of Media Studies 6, no. 3 (2024): 83–100. http://dx.doi.org/10.46539/gmd.v6i3.502.
Hafeez, Rabab, Muhammad Waqas Anwar, Muhammad Hasan Jamal, et al. "Contextual Urdu Lemmatization Using Recurrent Neural Network Models." Mathematics 11, no. 2 (2023): 435. http://dx.doi.org/10.3390/math11020435.
Oralbekova, Dina, Orken Mamyrbayev, Mohamed Othman, Dinara Kassymova, and Kuralai Mukhsina. "Contemporary Approaches in Evolving Language Models." Applied Sciences 13, no. 23 (2023): 12901. http://dx.doi.org/10.3390/app132312901.
Yogatama, Dani, Cyprien de Masson d’Autume, and Lingpeng Kong. "Adaptive Semiparametric Language Models." Transactions of the Association for Computational Linguistics 9 (2021): 362–73. http://dx.doi.org/10.1162/tacl_a_00371.
Constantinescu, Ionut, Tiago Pimentel, Ryan Cotterell, and Alex Warstadt. "Investigating Critical Period Effects in Language Acquisition through Neural Language Models." Transactions of the Association for Computational Linguistics 13 (January 24, 2024): 96–120. https://doi.org/10.1162/tacl_a_00725.
Tinn, Robert, Hao Cheng, Yu Gu, et al. "Fine-tuning large neural language models for biomedical natural language processing." Patterns 4, no. 4 (2023): 100729. http://dx.doi.org/10.1016/j.patter.2023.100729.
Choi, Sunjoo, Myung-Kwan Park, and Euhee Kim. "How are Korean Neural Language Models ‘surprised’ Layerwisely?" Journal of Language Sciences 28, no. 4 (2021): 301–17. http://dx.doi.org/10.14384/kals.2021.28.4.301.
Zhang, Peng, Wenjie Hui, Benyou Wang, et al. "Complex-valued Neural Network-based Quantum Language Models." ACM Transactions on Information Systems 40, no. 4 (2022): 1–31. http://dx.doi.org/10.1145/3505138.
Lee, Jaemin, and Jeong-Ah Shin. "Evaluating L2 Training Methods in Neural Language Models." Language Research 60, no. 3 (2024): 323–45. https://doi.org/10.30961/lr.2024.60.3.323.
Tanaka, Tomohiro, Ryo Masumura, and Takanobu Oba. "Neural candidate-aware language models for speech recognition." Computer Speech & Language 66 (March 2021): 101157. http://dx.doi.org/10.1016/j.csl.2020.101157.
Kong, Weirui, Hyeju Jang, Giuseppe Carenini, and Thalia S. Field. "Exploring neural models for predicting dementia from language." Computer Speech & Language 68 (July 2021): 101181. http://dx.doi.org/10.1016/j.csl.2020.101181.
Phan, Tien D., and Nur Zincir-Heywood. "User identification via neural network based language models." International Journal of Network Management 29, no. 3 (2018): e2049. http://dx.doi.org/10.1002/nem.2049.
Karyukin, Vladislav, Diana Rakhimova, Aidana Karibayeva, Aliya Turganbayeva, and Asem Turarbek. "The neural machine translation models for the low-resource Kazakh–English language pair." PeerJ Computer Science 9 (February 8, 2023): e1224. http://dx.doi.org/10.7717/peerj-cs.1224.
Lai, Yihan. "Enhancing Linguistic Bridges: Seq2seq Models and the Future of Machine Translation." Highlights in Science, Engineering and Technology 111 (August 19, 2024): 410–14. https://doi.org/10.54097/pf2xsr76.
Budaya, I. Gede Bintang Arya, Made Windu Antara Kesiman, and I. Made Gede Sunarya. "The Influence of Word Vectorization for Kawi Language to Indonesian Language Neural Machine Translation." Journal of Information Technology and Computer Science 7, no. 1 (2022): 81–93. http://dx.doi.org/10.25126/jitecs.202271387.
Studenikina, Kseniia Andreevna. "Evaluation of neural models’ linguistic competence: evidence from Russian predicate agreement." Proceedings of the Institute for System Programming of the RAS 34, no. 6 (2022): 178–84. http://dx.doi.org/10.15514/ispras-2022-34(6)-14.
Meijer, Erik. "Virtual Machinations: Using Large Language Models as Neural Computers." Queue 22, no. 3 (2024): 25–52. http://dx.doi.org/10.1145/3676287.
Rane, Kirti, Tanaya Bagwe, Shruti Chaudhari, Ankita Kale, and Gayatri Deore. "Enhancing En-X Translation: A Chrome Extension-Based Approach to Indic Language Models." International Journal of Scientific Research in Engineering and Management 9, no. 3 (2025): 1–9. https://doi.org/10.55041/ijsrem42782.
Goldberg, Yoav. "A Primer on Neural Network Models for Natural Language Processing." Journal of Artificial Intelligence Research 57 (November 20, 2016): 345–420. http://dx.doi.org/10.1613/jair.4992.
Zhao, Xiaodong, Rouyi Fan, and Wanyue Liu. "Research on Transformer-Based Multilingual Machine Translation Methods." Journal of Intelligence and Knowledge Engineering 3, no. 1 (2025): 57–67. https://doi.org/10.62517/jike.202504108.
Mane, Deepak. "Transformer based Neural Network Architecture for Regional Language Translation." Advances in Nonlinear Variational Inequalities 28, no. 3s (2024): 211–25. https://doi.org/10.52783/anvi.v28.2925.
Jabar, H. Yousif. "Neural Computing based Part of Speech Tagger for Arabic Language: A review study." International Journal of Computation and Applied Sciences IJOCAAS 1, no. 5 (2020): 361–65. https://doi.org/10.5281/zenodo.4002418.
Wu, Yi-Chao, Fei Yin, and Cheng-Lin Liu. "Improving handwritten Chinese text recognition using neural network language models and convolutional neural network shape models." Pattern Recognition 65 (May 2017): 251–64. http://dx.doi.org/10.1016/j.patcog.2016.12.026.
Babić, Karlo, Sanda Martinčić-Ipšić, and Ana Meštrović. "Survey of Neural Text Representation Models." Information 11, no. 11 (2020): 511. http://dx.doi.org/10.3390/info11110511.
Muhammad, Murad, Shahzad Muhammad, and Fareed Naheeda. "Research Comparative Analysis of OCR Models for Urdu Language Characters Recognition." LC International Journal of STEM 5, no. 3 (2024): 55–63. https://doi.org/10.5281/zenodo.14028816.
Hahn, Michael. "Theoretical Limitations of Self-Attention in Neural Sequence Models." Transactions of the Association for Computational Linguistics 8 (July 2020): 156–71. http://dx.doi.org/10.1162/tacl_a_00306.