Academic literature on the topic "Multi-lingual training"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference proceedings, and other scholarly sources on the topic "Multi-lingual training".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Multi-lingual training"
Chi, Zewen, Li Dong, Furu Wei, Wenhui Wang, Xian-Ling Mao, and Heyan Huang. "Cross-Lingual Natural Language Generation via Pre-Training". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7570–77. http://dx.doi.org/10.1609/aaai.v34i05.6256.
Cao, Yue, Xiaojun Wan, Jinge Yao, and Dian Yu. "MultiSumm: Towards a Unified Model for Multi-Lingual Abstractive Summarization". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 11–18. http://dx.doi.org/10.1609/aaai.v34i01.5328.
Kovacic, Michael, and Karl Cunningham. "Effective Electrical Safety Program Training in Multi-Lingual/Cultural Environments". IEEE Transactions on Industry Applications 55, no. 4 (July 2019): 4384–88. http://dx.doi.org/10.1109/tia.2019.2907883.
Zhan, Qingran, Xiang Xie, Chenguang Hu, Juan Zuluaga-Gomez, Jing Wang, and Haobo Cheng. "Domain-Adversarial Based Model with Phonological Knowledge for Cross-Lingual Speech Recognition". Electronics 10, no. 24 (December 20, 2021): 3172. http://dx.doi.org/10.3390/electronics10243172.
Zhang, Mozhi, Yoshinari Fujinuma, and Jordan Boyd-Graber. "Exploiting Cross-Lingual Subword Similarities in Low-Resource Document Classification". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 9547–54. http://dx.doi.org/10.1609/aaai.v34i05.6500.
Guinzoni, Roberta. "Walgreens Boots Alliance goes multi-lingual through e-learning". Human Resource Management International Digest 23, no. 7 (October 12, 2015): 5–8. http://dx.doi.org/10.1108/hrmid-08-2015-0138.
Pinto da Costa, Mariana. "Conducting Cross-Cultural, Multi-Lingual and Multi-Country Focus Groups: Guidance for Researchers". International Journal of Qualitative Methods 20 (January 2021): 160940692110499. http://dx.doi.org/10.1177/16094069211049929.
Fuad, Ahlam, and Maha Al-Yahya. "AraConv: Developing an Arabic Task-Oriented Dialogue System Using Multi-Lingual Transformer Model mT5". Applied Sciences 12, no. 4 (February 11, 2022): 1881. http://dx.doi.org/10.3390/app12041881.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations". PLOS ONE 16, no. 9 (September 21, 2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Xiang, Lu, Junnan Zhu, Yang Zhao, Yu Zhou, and Chengqing Zong. "Robust Cross-lingual Task-oriented Dialogue". ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 6 (November 30, 2021): 1–24. http://dx.doi.org/10.1145/3457571.
Dissertations and theses on the topic "Multi-lingual training"
Dehouck, Mathieu. "Multi-lingual dependency parsing : word representation and joint training for syntactic analysis". Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I019/document.
While modern dependency parsers have become as good as human experts, they still rely heavily on hand-annotated training examples, which are available for only a handful of languages. Several methods, such as model and annotation transfer, have been proposed to make high-quality syntactic analysis available to low-resource languages as well. In this thesis, we propose new approaches for sharing information across languages that rely on their shared morphological features. First, we use shared morphological features to induce cross-lingual delexicalised word representations that help in learning syntactic analysis models. Then, we propose a new multi-task learning framework called phylogenetic learning, which learns models for related tasks/languages guided by the tasks'/languages' evolutionary tree. Finally, with our new measure of morphosyntactic complexity, we investigate the intrinsic role of morphological information in dependency parsing.
Anoop, C. S. "Automatic speech recognition for low-resource Indian languages". Thesis, 2023. https://etd.iisc.ac.in/handle/2005/6195.
Books on the topic "Multi-lingual training"
Chatrik, Balbir. The obstacle course: Experiences of multi-lingual trainees on youth training and employment training. London: Youthaid, 1992.
Book chapters on the topic "Multi-lingual training"
Landon-Smith, Kristine, and Chris Hay. "Empowering the somatically othered actor through multi-lingual improvisation in training". In Stages of Reckoning, 149–63. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003032076-12.
Nouza, Jan, and Radek Safarik. "Parliament Archives Used for Automatic Training of Multi-lingual Automatic Speech Recognition Systems". In Text, Speech, and Dialogue, 174–82. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64206-2_20.
Hagans, Kristi S., and Catherine Richards-Tutor. "Interdisciplinary Training in Intensive Intervention for Students With Disabilities and Multi-Lingual Youth". In Handbook of Research on Interdisciplinary Preparation for Equitable Special Education, 296–317. IGI Global, 2023. http://dx.doi.org/10.4018/978-1-6684-6438-0.ch015.
Tsurutani, Chiharu. "Computer-Assisted Pronunciation Training and Assessment (CAPTA) Programs". In Computer-Assisted Foreign Language Teaching and Learning, 276–88. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2821-2.ch016.
Conference papers on the topic "Multi-lingual training"
Li, Shicheng, Pengcheng Yang, Fuli Luo, and Jun Xie. "Multi-Granularity Contrasting for Cross-Lingual Pre-Training". In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-acl.149.
Qin, Libo, Minheng Ni, Yue Zhang, and Wanxiang Che. "CoSDA-ML: Multi-Lingual Code-Switching Data Augmentation for Zero-Shot Cross-Lingual NLP". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/533.
Soky, Kak, Sheng Li, Tatsuya Kawahara, and Sopheap Seng. "Multi-lingual Transformer Training for Khmer Automatic Speech Recognition". In 2019 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC). IEEE, 2019. http://dx.doi.org/10.1109/apsipaasc47483.2019.9023137.
Saiko, Masahiro, Hitoshi Yamamoto, Ryosuke Isotani, and Chiori Hori. "Efficient multi-lingual unsupervised acoustic model training under mismatch conditions". In 2014 IEEE Spoken Language Technology Workshop (SLT). IEEE, 2014. http://dx.doi.org/10.1109/slt.2014.7078544.
Gessler, Luke, and Amir Zeldes. "MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning". In Proceedings of the 2nd Workshop on Multi-lingual Representation Learning (MRL). Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.mrl-1.9.
Conceição, Jhonatas Santos de Jesus, Allan Pinto, Luis Decker, Jose Luis Flores Campana, Manuel Cordova Neira, Andrezza A. Dos Santos, Helio Pedrini, and Ricardo Torres. "Multi-Lingual Text Localization via Language-Specific Convolutional Neural Networks". In XXXII Conference on Graphics, Patterns and Images. Sociedade Brasileira de Computação - SBC, 2019. http://dx.doi.org/10.5753/sibgrapi.est.2019.8333.
He, Xiaodong, Li Deng, Dilek Hakkani-Tur, and Gokhan Tur. "Multi-style adaptive training for robust cross-lingual spoken language understanding". In ICASSP 2013 - 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2013. http://dx.doi.org/10.1109/icassp.2013.6639292.
Masumura, Ryo, Yusuke Shinohara, Ryuichiro Higashinaka, and Yushi Aono. "Adversarial Training for Multi-task and Multi-lingual Joint Modeling of Utterance Intent Classification". In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1064.
Lai, Siyu, Hui Huang, Dong Jing, Yufeng Chen, Jinan Xu, and Jian Liu. "Saliency-based Multi-View Mixed Language Training for Zero-shot Cross-lingual Classification". In Findings of the Association for Computational Linguistics: EMNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-emnlp.55.
Barry, James, Joachim Wagner, and Jennifer Foster. "Cross-lingual Parsing with Polyglot Training and Multi-treebank Learning: A Faroese Case Study". In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-6118.