Journal articles on the topic "Neural language models"
Below are the 50 best journal articles for research on the topic "Neural language models".
Buckman, Jacob, and Graham Neubig. "Neural Lattice Language Models". Transactions of the Association for Computational Linguistics 6 (December 2018): 529–41. http://dx.doi.org/10.1162/tacl_a_00036.
Bengio, Yoshua. "Neural net language models". Scholarpedia 3, no. 1 (2008): 3881. http://dx.doi.org/10.4249/scholarpedia.3881.
Dong, Li. "Learning natural language interfaces with neural models". AI Matters 7, no. 2 (June 2021): 14–17. http://dx.doi.org/10.1145/3478369.3478375.
De Coster, Mathieu, and Joni Dambre. "Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation". Information 13, no. 5 (April 23, 2022): 220. http://dx.doi.org/10.3390/info13050220.
Chang, Tyler A., and Benjamin K. Bergen. "Word Acquisition in Neural Language Models". Transactions of the Association for Computational Linguistics 10 (2022): 1–16. http://dx.doi.org/10.1162/tacl_a_00444.
Mezzoudj, Freha, and Abdelkader Benyettou. "An empirical study of statistical language models: n-gram language models vs. neural network language models". International Journal of Innovative Computing and Applications 9, no. 4 (2018): 189. http://dx.doi.org/10.1504/ijica.2018.095762.
Lau, Mandy. "Artificial intelligence language models and the false fantasy of participatory language policies". Working papers in Applied Linguistics and Linguistics at York 1 (September 13, 2021): 4–15. http://dx.doi.org/10.25071/2564-2855.5.
Qi, Kunxun, and Jianfeng Du. "Translation-Based Matching Adversarial Network for Cross-Lingual Natural Language Inference". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 8632–39. http://dx.doi.org/10.1609/aaai.v34i05.6387.
Park, Myung-Kwan, Keonwoo Koo, Jaemin Lee, and Wonil Chung. "Investigating Syntactic Transfer from English to Korean in Neural L2 Language Models". Studies in Modern Grammar 121 (March 30, 2024): 177–201. http://dx.doi.org/10.14342/smog.2024.121.177.
Bayer, Ali Orkan, and Giuseppe Riccardi. "Semantic language models with deep neural networks". Computer Speech & Language 40 (November 2016): 1–22. http://dx.doi.org/10.1016/j.csl.2016.04.001.
Chuchupal, V. Y. "Neural language models for automatic speech recognition". Речевые технологии, no. 1-2 (2020): 27–47. http://dx.doi.org/10.58633/2305-8129_2020_1-2_27.
Tian, Yijun, Huan Song, Zichen Wang, Haozhu Wang, Ziqing Hu, Fang Wang, Nitesh V. Chawla, and Panpan Xu. "Graph Neural Prompting with Large Language Models". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 17 (March 24, 2024): 19080–88. http://dx.doi.org/10.1609/aaai.v38i17.29875.
Hale, John T., Luca Campanelli, Jixing Li, Shohini Bhattasali, Christophe Pallier, and Jonathan R. Brennan. "Neurocomputational Models of Language Processing". Annual Review of Linguistics 8, no. 1 (January 14, 2022): 427–46. http://dx.doi.org/10.1146/annurev-linguistics-051421-020803.
Klemen, Matej, and Slavko Zitnik. "Neural coreference resolution for Slovene language". Computer Science and Information Systems, no. 00 (2021): 60. http://dx.doi.org/10.2298/csis201120060k.
Schomacker, Thorben, and Marina Tropmann-Frick. "Language Representation Models: An Overview". Entropy 23, no. 11 (October 28, 2021): 1422. http://dx.doi.org/10.3390/e23111422.
Kunchukuttan, Anoop, Mitesh Khapra, Gurneet Singh, and Pushpak Bhattacharyya. "Leveraging Orthographic Similarity for Multilingual Neural Transliteration". Transactions of the Association for Computational Linguistics 6 (December 2018): 303–16. http://dx.doi.org/10.1162/tacl_a_00022.
Takahashi, Shuntaro, and Kumiko Tanaka-Ishii. "Evaluating Computational Language Models with Scaling Properties of Natural Language". Computational Linguistics 45, no. 3 (September 2019): 481–513. http://dx.doi.org/10.1162/coli_a_00355.
Martin, Andrea E. "A Compositional Neural Architecture for Language". Journal of Cognitive Neuroscience 32, no. 8 (August 2020): 1407–27. http://dx.doi.org/10.1162/jocn_a_01552.
Mukhamadiyev, Abdinabi, Mukhriddin Mukhiddinov, Ilyos Khujayarov, Mannon Ochilov, and Jinsoo Cho. "Development of Language Models for Continuous Uzbek Speech Recognition System". Sensors 23, no. 3 (January 19, 2023): 1145. http://dx.doi.org/10.3390/s23031145.
Oralbekova, Dina, Orken Mamyrbayev, Mohamed Othman, Dinara Kassymova, and Kuralai Mukhsina. "Contemporary Approaches in Evolving Language Models". Applied Sciences 13, no. 23 (December 1, 2023): 12901. http://dx.doi.org/10.3390/app132312901.
Hafeez, Rabab, Muhammad Waqas Anwar, Muhammad Hasan Jamal, Tayyaba Fatima, Julio César Martínez Espinosa, Luis Alonso Dzul López, Ernesto Bautista Thompson, and Imran Ashraf. "Contextual Urdu Lemmatization Using Recurrent Neural Network Models". Mathematics 11, no. 2 (January 13, 2023): 435. http://dx.doi.org/10.3390/math11020435.
Yogatama, Dani, Cyprien de Masson d’Autume, and Lingpeng Kong. "Adaptive Semiparametric Language Models". Transactions of the Association for Computational Linguistics 9 (2021): 362–73. http://dx.doi.org/10.1162/tacl_a_00371.
Tinn, Robert, Hao Cheng, Yu Gu, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, and Hoifung Poon. "Fine-tuning large neural language models for biomedical natural language processing". Patterns 4, no. 4 (April 2023): 100729. http://dx.doi.org/10.1016/j.patter.2023.100729.
Choi, Sunjoo, Myung-Kwan Park, and Euhee Kim. "How are Korean Neural Language Models ‘surprised’ Layerwisely?" Journal of Language Sciences 28, no. 4 (November 30, 2021): 301–17. http://dx.doi.org/10.14384/kals.2021.28.4.301.
Zhang, Peng, Wenjie Hui, Benyou Wang, Donghao Zhao, Dawei Song, Christina Lioma, and Jakob Grue Simonsen. "Complex-valued Neural Network-based Quantum Language Models". ACM Transactions on Information Systems 40, no. 4 (October 31, 2022): 1–31. http://dx.doi.org/10.1145/3505138.
Tanaka, Tomohiro, Ryo Masumura, and Takanobu Oba. "Neural candidate-aware language models for speech recognition". Computer Speech & Language 66 (March 2021): 101157. http://dx.doi.org/10.1016/j.csl.2020.101157.
Kong, Weirui, Hyeju Jang, Giuseppe Carenini, and Thalia S. Field. "Exploring neural models for predicting dementia from language". Computer Speech & Language 68 (July 2021): 101181. http://dx.doi.org/10.1016/j.csl.2020.101181.
Phan, Tien D., and Nur Zincir‐Heywood. "User identification via neural network based language models". International Journal of Network Management 29, no. 3 (October 30, 2018): e2049. http://dx.doi.org/10.1002/nem.2049.
Karyukin, Vladislav, Diana Rakhimova, Aidana Karibayeva, Aliya Turganbayeva, and Asem Turarbek. "The neural machine translation models for the low-resource Kazakh–English language pair". PeerJ Computer Science 9 (February 8, 2023): e1224. http://dx.doi.org/10.7717/peerj-cs.1224.
Budaya, I. Gede Bintang Arya, Made Windu Antara Kesiman, and I. Made Gede Sunarya. "The Influence of Word Vectorization for Kawi Language to Indonesian Language Neural Machine Translation". Journal of Information Technology and Computer Science 7, no. 1 (September 29, 2022): 81–93. http://dx.doi.org/10.25126/jitecs.202271387.
Wu, Yi-Chao, Fei Yin, and Cheng-Lin Liu. "Improving handwritten Chinese text recognition using neural network language models and convolutional neural network shape models". Pattern Recognition 65 (May 2017): 251–64. http://dx.doi.org/10.1016/j.patcog.2016.12.026.
Studenikina, Kseniia Andreevna. "Evaluation of neural models’ linguistic competence: evidence from Russian predicate agreement". Proceedings of the Institute for System Programming of the RAS 34, no. 6 (2022): 178–84. http://dx.doi.org/10.15514/ispras-2022-34(6)-14.
Goldberg, Yoav. "A Primer on Neural Network Models for Natural Language Processing". Journal of Artificial Intelligence Research 57 (November 20, 2016): 345–420. http://dx.doi.org/10.1613/jair.4992.
Babić, Karlo, Sanda Martinčić-Ipšić, and Ana Meštrović. "Survey of Neural Text Representation Models". Information 11, no. 11 (October 30, 2020): 511. http://dx.doi.org/10.3390/info11110511.
Hahn, Michael. "Theoretical Limitations of Self-Attention in Neural Sequence Models". Transactions of the Association for Computational Linguistics 8 (July 2020): 156–71. http://dx.doi.org/10.1162/tacl_a_00306.
Yoo, YongSuk, and Kang-moon Park. "Developing Language-Specific Models Using a Neural Architecture Search". Applied Sciences 11, no. 21 (November 3, 2021): 10324. http://dx.doi.org/10.3390/app112110324.
Cangelosi, Angelo. "The emergence of language: neural and adaptive agent models". Connection Science 17, no. 3-4 (September 2005): 185–90. http://dx.doi.org/10.1080/09540090500177471.
Zamora-Martínez, F., V. Frinken, S. España-Boquera, M. J. Castro-Bleda, A. Fischer, and H. Bunke. "Neural network language models for off-line handwriting recognition". Pattern Recognition 47, no. 4 (April 2014): 1642–52. http://dx.doi.org/10.1016/j.patcog.2013.10.020.
Shi, Yangyang, Martha Larson, Joris Pelemans, Catholijn M. Jonker, Patrick Wambacq, Pascal Wiggers, and Kris Demuynck. "Integrating meta-information into recurrent neural network language models". Speech Communication 73 (October 2015): 64–80. http://dx.doi.org/10.1016/j.specom.2015.06.006.
Lalrempuii, Candy, Badal Soni, and Partha Pakray. "An Improved English-to-Mizo Neural Machine Translation". ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 4 (May 26, 2021): 1–21. http://dx.doi.org/10.1145/3445974.
Ananthanarayana, Tejaswini, Priyanshu Srivastava, Akash Chintha, Akhil Santha, Brian Landy, Joseph Panaro, Andre Webster, et al. "Deep Learning Methods for Sign Language Translation". ACM Transactions on Accessible Computing 14, no. 4 (December 31, 2021): 1–30. http://dx.doi.org/10.1145/3477498.
Karrupusamy, P. "Analysis of Neural Network Based Language Modeling". March 2020 2, no. 1 (March 30, 2020): 53–63. http://dx.doi.org/10.36548/jaicn.2020.1.006.
Arisoy, Ebru, Stanley F. Chen, Bhuvana Ramabhadran, and Abhinav Sethy. "Converting Neural Network Language Models into Back-off Language Models for Efficient Decoding in Automatic Speech Recognition". IEEE/ACM Transactions on Audio, Speech, and Language Processing 22, no. 1 (January 2014): 184–92. http://dx.doi.org/10.1109/taslp.2013.2286919.
Rijhwani, Shruti, Jiateng Xie, Graham Neubig, and Jaime Carbonell. "Zero-Shot Neural Transfer for Cross-Lingual Entity Linking". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 6924–31. http://dx.doi.org/10.1609/aaai.v33i01.33016924.
Demeter, David, and Doug Downey. "Just Add Functions: A Neural-Symbolic Language Model". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7634–42. http://dx.doi.org/10.1609/aaai.v34i05.6264.
Kipyatkova, Irina, and Ildar Kagirov. "Deep Models for Low-Resourced Speech Recognition: Livvi-Karelian Case". Mathematics 11, no. 18 (September 5, 2023): 3814. http://dx.doi.org/10.3390/math11183814.
Gerz, Daniela, Ivan Vulić, Edoardo Ponti, Jason Naradowsky, Roi Reichart, and Anna Korhonen. "Language Modeling for Morphologically Rich Languages: Character-Aware Modeling for Word-Level Prediction". Transactions of the Association for Computational Linguistics 6 (December 2018): 451–65. http://dx.doi.org/10.1162/tacl_a_00032.
Johnson, Melvin, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, et al. "Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation". Transactions of the Association for Computational Linguistics 5 (December 2017): 339–51. http://dx.doi.org/10.1162/tacl_a_00065.