Academic literature on the topic 'Multi-lingual training'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multi-lingual training.'
Journal articles on the topic "Multi-lingual training"
Chi, Zewen, Li Dong, Furu Wei, Wenhui Wang, Xian-Ling Mao, and Heyan Huang. "Cross-Lingual Natural Language Generation via Pre-Training." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 7570–77. http://dx.doi.org/10.1609/aaai.v34i05.6256.
Cao, Yue, Xiaojun Wan, Jinge Yao, and Dian Yu. "MultiSumm: Towards a Unified Model for Multi-Lingual Abstractive Summarization." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 11–18. http://dx.doi.org/10.1609/aaai.v34i01.5328.
Kovacic, Michael, and Karl Cunningham. "Effective Electrical Safety Program Training in Multi-Lingual/Cultural Environments." IEEE Transactions on Industry Applications 55, no. 4 (July 2019): 4384–88. http://dx.doi.org/10.1109/tia.2019.2907883.
Zhan, Qingran, Xiang Xie, Chenguang Hu, Juan Zuluaga-Gomez, Jing Wang, and Haobo Cheng. "Domain-Adversarial Based Model with Phonological Knowledge for Cross-Lingual Speech Recognition." Electronics 10, no. 24 (December 20, 2021): 3172. http://dx.doi.org/10.3390/electronics10243172.
Zhang, Mozhi, Yoshinari Fujinuma, and Jordan Boyd-Graber. "Exploiting Cross-Lingual Subword Similarities in Low-Resource Document Classification." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (April 3, 2020): 9547–54. http://dx.doi.org/10.1609/aaai.v34i05.6500.
Guinzoni, Roberta. "Walgreens Boots Alliance goes multi-lingual through e-learning." Human Resource Management International Digest 23, no. 7 (October 12, 2015): 5–8. http://dx.doi.org/10.1108/hrmid-08-2015-0138.
Pinto da Costa, Mariana. "Conducting Cross-Cultural, Multi-Lingual and Multi-Country Focus Groups: Guidance for Researchers." International Journal of Qualitative Methods 20 (January 2021): 160940692110499. http://dx.doi.org/10.1177/16094069211049929.
Fuad, Ahlam, and Maha Al-Yahya. "AraConv: Developing an Arabic Task-Oriented Dialogue System Using Multi-Lingual Transformer Model mT5." Applied Sciences 12, no. 4 (February 11, 2022): 1881. http://dx.doi.org/10.3390/app12041881.
Yan, Huijiong, Tao Qian, Liang Xie, and Shanguang Chen. "Unsupervised cross-lingual model transfer for named entity recognition with contextualized word representations." PLOS ONE 16, no. 9 (September 21, 2021): e0257230. http://dx.doi.org/10.1371/journal.pone.0257230.
Xiang, Lu, Junnan Zhu, Yang Zhao, Yu Zhou, and Chengqing Zong. "Robust Cross-lingual Task-oriented Dialogue." ACM Transactions on Asian and Low-Resource Language Information Processing 20, no. 6 (November 30, 2021): 1–24. http://dx.doi.org/10.1145/3457571.
Dissertations / Theses on the topic "Multi-lingual training"
Dehouck, Mathieu. "Multi-lingual dependency parsing: word representation and joint training for syntactic analysis." Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I019/document.
While modern dependency parsers have become as good as human experts, they still rely heavily on hand-annotated training examples, which are available for only a handful of languages. Several methods, such as model and annotation transfer, have been proposed to make high-quality syntactic analysis available to low-resource languages as well. In this thesis, we propose new approaches for sharing information across languages that rely on their shared morphological features. First, we use shared morphological features to induce cross-lingual delexicalised word representations that help in learning syntactic analysis models. Then, we propose a new multi-task learning framework, called phylogenetic learning, which learns models for related tasks/languages guided by the tasks'/languages' evolutionary tree. Finally, with our new measure of morphosyntactic complexity, we investigate the intrinsic role of morphological information in dependency parsing.
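The phylogenetic learning idea summarised in this abstract can be illustrated with a short, self-contained sketch: a model is trained at each node of a language family tree on the pooled data of its descendants, and each child is initialised from its parent's weights, so low-resource leaf languages inherit what related languages have already learned. Everything below is an illustrative assumption (the toy tree, the logistic-regression stand-in for a parser, and names such as LanguageNode and phylogenetic_training); it is not the thesis' actual parser or code.

# Minimal sketch of top-down "phylogenetic" training over a language family tree.
# Assumed/invented for illustration: LanguageNode, the French/Occitan toy data,
# and the logistic-regression trainer standing in for a dependency parser.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import numpy as np

@dataclass
class LanguageNode:
    name: str
    children: List["LanguageNode"] = field(default_factory=list)
    # Annotated examples exist only at the leaves (individual languages).
    data: List[Tuple[np.ndarray, int]] = field(default_factory=list)

def gather_data(node: LanguageNode) -> List[Tuple[np.ndarray, int]]:
    """Pool the training examples of every language below this node."""
    examples = list(node.data)
    for child in node.children:
        examples.extend(gather_data(child))
    return examples

def train(weights: np.ndarray, data, epochs: int = 5, lr: float = 0.1) -> np.ndarray:
    """Toy logistic-regression update standing in for a parser trainer."""
    w = weights.copy()
    for _ in range(epochs):
        for x, y in data:
            pred = 1.0 / (1.0 + np.exp(-w @ x))
            w += lr * (y - pred) * x
    return w

def phylogenetic_training(node: LanguageNode, parent_w: np.ndarray,
                          models: Dict[str, np.ndarray]) -> None:
    """Train top-down: each node starts from its ancestor's weights, so
    low-resource leaves inherit parameters learned from related languages."""
    w = train(parent_w, gather_data(node))
    models[node.name] = w
    for child in node.children:
        phylogenetic_training(child, w, models)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8

    def make(n: int) -> List[Tuple[np.ndarray, int]]:
        return [(rng.normal(size=dim), int(rng.random() > 0.5)) for _ in range(n)]

    # Tiny invented family: Romance -> {French (many examples), Occitan (few)}.
    tree = LanguageNode("Romance", children=[
        LanguageNode("French", data=make(200)),
        LanguageNode("Occitan", data=make(5)),   # low-resource leaf
    ])
    models: Dict[str, np.ndarray] = {}
    phylogenetic_training(tree, parent_w=np.zeros(dim), models=models)
    print({name: np.round(w[:3], 2) for name, w in models.items()})

In the thesis itself the per-node model is a full dependency parser and the tree is a linguistic phylogeny covering many families; the sketch only shows the top-down parameter sharing along the tree.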
Anoop, C. S. "Automatic speech recognition for low-resource Indian languages." Thesis, 2023. https://etd.iisc.ac.in/handle/2005/6195.
Books on the topic "Multi-lingual training"
Chatrik, Balbir. The obstacle course: Experiences of multi-lingual trainees on youth training and employment training. London: Youthaid, 1992.
Book chapters on the topic "Multi-lingual training"
Landon-Smith, Kristine, and Chris Hay. "Empowering the somatically othered actor through multi-lingual improvisation in training." In Stages of Reckoning, 149–63. London: Routledge, 2022. http://dx.doi.org/10.4324/9781003032076-12.
Nouza, Jan, and Radek Safarik. "Parliament Archives Used for Automatic Training of Multi-lingual Automatic Speech Recognition Systems." In Text, Speech, and Dialogue, 174–82. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64206-2_20.
Hagans, Kristi S., and Catherine Richards-Tutor. "Interdisciplinary Training in Intensive Intervention for Students With Disabilities and Multi-Lingual Youth." In Handbook of Research on Interdisciplinary Preparation for Equitable Special Education, 296–317. IGI Global, 2023. http://dx.doi.org/10.4018/978-1-6684-6438-0.ch015.
Tsurutani, Chiharu. "Computer-Assisted Pronunciation Training and Assessment (CAPTA) Programs." In Computer-Assisted Foreign Language Teaching and Learning, 276–88. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2821-2.ch016.
Conference papers on the topic "Multi-lingual training"
Li, Shicheng, Pengcheng Yang, Fuli Luo, and Jun Xie. "Multi-Granularity Contrasting for Cross-Lingual Pre-Training." In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-acl.149.
Qin, Libo, Minheng Ni, Yue Zhang, and Wanxiang Che. "CoSDA-ML: Multi-Lingual Code-Switching Data Augmentation for Zero-Shot Cross-Lingual NLP." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/533.
Soky, Kak, Sheng Li, Tatsuya Kawahara, and Sopheap Seng. "Multi-lingual Transformer Training for Khmer Automatic Speech Recognition." In 2019 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC). IEEE, 2019. http://dx.doi.org/10.1109/apsipaasc47483.2019.9023137.
Saiko, Masahiro, Hitoshi Yamamoto, Ryosuke Isotani, and Chiori Hori. "Efficient multi-lingual unsupervised acoustic model training under mismatch conditions." In 2014 IEEE Spoken Language Technology Workshop (SLT). IEEE, 2014. http://dx.doi.org/10.1109/slt.2014.7078544.
Gessler, Luke, and Amir Zeldes. "MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning." In Proceedings of the The 2nd Workshop on Multi-lingual Representation Learning (MRL). Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.mrl-1.9.
Full textConceição, Jhonatas Santos de Jesus, Allan Pinto, Luis Decker, Jose Luis Flores Campana, Manuel Cordova Neira, Andrezza A. Dos Santos, Helio Pedrini, and Ricardo Torres. "Multi-Lingual Text Localization via Language-Specific Convolutional Neural Networks." In XXXII Conference on Graphics, Patterns and Images. Sociedade Brasileira de Computação - SBC, 2019. http://dx.doi.org/10.5753/sibgrapi.est.2019.8333.
He, Xiaodong, Li Deng, Dilek Hakkani-Tur, and Gokhan Tur. "Multi-style adaptive training for robust cross-lingual spoken language understanding." In ICASSP 2013 - 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2013. http://dx.doi.org/10.1109/icassp.2013.6639292.
Masumura, Ryo, Yusuke Shinohara, Ryuichiro Higashinaka, and Yushi Aono. "Adversarial Training for Multi-task and Multi-lingual Joint Modeling of Utterance Intent Classification." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1064.
Lai, Siyu, Hui Huang, Dong Jing, Yufeng Chen, Jinan Xu, and Jian Liu. "Saliency-based Multi-View Mixed Language Training for Zero-shot Cross-lingual Classification." In Findings of the Association for Computational Linguistics: EMNLP 2021. Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.findings-emnlp.55.
Barry, James, Joachim Wagner, and Jennifer Foster. "Cross-lingual Parsing with Polyglot Training and Multi-treebank Learning: A Faroese Case Study." In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-6118.