Academic literature on the topic 'Non-autoregressive Machine Translation'
Below are lists of relevant journal articles, theses, book chapters, and conference papers on the topic "Non-autoregressive Machine Translation."
Journal articles on the topic "Non-autoregressive Machine Translation"
Wang, Yiren, Fei Tian, Di He, Tao Qin, ChengXiang Zhai, and Tie-Yan Liu. "Non-Autoregressive Machine Translation with Auxiliary Regularization." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5377–84. http://dx.doi.org/10.1609/aaai.v33i01.33015377.
Wang, Shuheng, Shumin Shi, Heyan Huang, and Wei Zhang. "Improving Non-Autoregressive Machine Translation via Autoregressive Training." Journal of Physics: Conference Series 2031, no. 1 (September 1, 2021): 012045. http://dx.doi.org/10.1088/1742-6596/2031/1/012045.
Shao, Chenze, Jinchao Zhang, Jie Zhou, and Yang Feng. "Rephrasing the Reference for Non-autoregressive Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13538–46. http://dx.doi.org/10.1609/aaai.v37i11.26587.
Ran, Qiu, Yankai Lin, Peng Li, and Jie Zhou. "Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 15 (May 18, 2021): 13727–35. http://dx.doi.org/10.1609/aaai.v35i15.17618.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Enhanced Encoder for Non-Autoregressive Machine Translation." Machine Translation 35, no. 4 (November 16, 2021): 595–609. http://dx.doi.org/10.1007/s10590-021-09285-x.
Shao, Chenze, Jinchao Zhang, Yang Feng, Fandong Meng, and Jie Zhou. "Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 198–205. http://dx.doi.org/10.1609/aaai.v34i01.5351.
Li, Feng, Jingxian Chen, and Xuejun Zhang. "A Survey of Non-Autoregressive Neural Machine Translation." Electronics 12, no. 13 (July 6, 2023): 2980. http://dx.doi.org/10.3390/electronics12132980.
Liu, Min, Yu Bao, Chengqi Zhao, and Shujian Huang. "Selective Knowledge Distillation for Non-Autoregressive Neural Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13246–54. http://dx.doi.org/10.1609/aaai.v37i11.26555.
Du, Quan, Kai Feng, Chen Xu, Tong Xiao, and Jingbo Zhu. "Non-Autoregressive Neural Machine Translation with Auxiliary Representation Fusion." Journal of Intelligent & Fuzzy Systems 41, no. 6 (December 16, 2021): 7229–39. http://dx.doi.org/10.3233/jifs-211105.
Zhang, Xinlu, Hongguan Wu, Beijiao Ma, and Zhengang Zhai. "Research on Low Resource Neural Machine Translation Based on Non-autoregressive Model." Journal of Physics: Conference Series 2171, no. 1 (January 1, 2022): 012045. http://dx.doi.org/10.1088/1742-6596/2171/1/012045.
Dissertations / Theses on the topic "Non-autoregressive Machine Translation"
Xu, Jitao. "Writing in two languages : Neural machine translation as an assistive bilingual writing tool." Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG078.
In an increasingly global world, more situations appear where people need to express themselves in a foreign language or multiple languages. However, for many people, writing in a foreign language is not an easy task. Machine translation tools can help generate texts in multiple languages. With the tangible progress in neural machine translation (NMT), translation technologies are delivering usable translations in a growing number of contexts. However, it is not yet realistic for NMT systems to produce error-free translations. Therefore, users with a good command of a given foreign language may find assistance from computer-aided translation technologies. In case of difficulties, users writing in a foreign language can access external resources such as dictionaries, terminologies, or bilingual concordancers. However, consulting these resources interrupts the writing process and starts another cognitive activity. To make the process smoother, it is possible to extend writing assistant systems to support bilingual text composition. However, existing studies mainly focus on generating texts in a foreign language. We suggest that showing corresponding texts in the user's mother tongue can also help users verify the composed texts with synchronized bitexts. In this thesis, we study techniques to build bilingual writing assistant systems that allow free composition in both languages and display synchronized monolingual texts in the two languages. We introduce two types of simulated interactive systems. The first solution allows users to compose mixed-language texts, which are then translated into their monolingual counterparts. We propose a dual decoder Transformer model comprising a shared encoder and two decoders to simultaneously produce texts in two languages.
We also explore the dual decoder model for various other tasks, such as multi-target translation, bidirectional translation, generating translation variants, and multilingual subtitling. The second design aims to extend commercial online translation systems by letting users freely alternate between the two languages, switching the text input box at will. In this scenario, the technical challenge is to keep the two input texts synchronized while taking the users' inputs into account, again with the goal of authoring two equally good versions of the text. For this, we introduce a general bilingual synchronization task and implement and experiment with autoregressive and non-autoregressive synchronization systems. We also investigate bilingual synchronization models on specific downstream tasks, such as parallel corpus cleaning and NMT with translation memories, to study the generalization ability of the proposed models.
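The shared-encoder, dual-decoder architecture the abstract describes can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the thesis implementation: all class names, dimensions, and the shared embedding are assumptions, and any cross-decoder interaction the thesis may use is omitted.

```python
import torch
import torch.nn as nn

class DualDecoderTransformer(nn.Module):
    """Sketch: one shared encoder, two decoders, each emitting one language."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Assumes a single shared vocabulary/embedding for simplicity.
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder_a = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.decoder_b = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.out_a = nn.Linear(d_model, vocab_size)
        self.out_b = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt_a, tgt_b):
        memory = self.encoder(self.embed(src))          # shared encoder states
        h_a = self.decoder_a(self.embed(tgt_a), memory)  # decoder for language A
        h_b = self.decoder_b(self.embed(tgt_b), memory)  # decoder for language B
        return self.out_a(h_a), self.out_b(h_b)

model = DualDecoderTransformer()
src = torch.randint(0, 1000, (2, 7))     # batch of 2 source sentences
tgt_a = torch.randint(0, 1000, (2, 5))   # target prefix, language A
tgt_b = torch.randint(0, 1000, (2, 6))   # target prefix, language B
logits_a, logits_b = model(src, tgt_a, tgt_b)
print(logits_a.shape, logits_b.shape)
```

Both decoders attend to the same encoder memory, which is what lets a single source (here, possibly mixed-language input) drive the simultaneous generation of two monolingual outputs.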
Book chapters on the topic "Non-autoregressive Machine Translation"
Zhou, Long, Jiajun Zhang, Yang Zhao, and Chengqing Zong. "Non-autoregressive Neural Machine Translation with Distortion Model." In Natural Language Processing and Chinese Computing, 403–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60450-9_32.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Improving Non-autoregressive Machine Translation with Soft-Masking." In Natural Language Processing and Chinese Computing, 141–52. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88480-2_12.
Guo, Ziyue, Hongxu Hou, Nier Wu, and Shuo Sun. "Word-Level Error Correction in Non-autoregressive Neural Machine Translation." In Communications in Computer and Information Science, 726–33. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63820-7_83.
Wang, Yisong, Hongxu Hou, Shuo Sun, Nier Wu, Weichen Jian, Zongheng Yang, and Pengcong Wang. "Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation." In Communications in Computer and Information Science, 72–81. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7960-6_8.
Full textConference papers on the topic "Non-autoregressive Machine Translation"
Bao, Guangsheng, Zhiyang Teng, Hao Zhou, Jianhao Yan, and Yue Zhang. "Non-Autoregressive Document-Level Machine Translation." In Findings of the Association for Computational Linguistics: EMNLP 2023. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-emnlp.986.
Xu, Jitao, Josep Crego, and François Yvon. "Integrating Translation Memories into Non-Autoregressive Machine Translation." In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.eacl-main.96.
Saharia, Chitwan, William Chan, Saurabh Saxena, and Mohammad Norouzi. "Non-Autoregressive Machine Translation with Latent Alignments." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.83.
Wei, Bingzhen, Mingxuan Wang, Hao Zhou, Junyang Lin, and Xu Sun. "Imitation Learning for Non-Autoregressive Neural Machine Translation." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1125.
Qian, Lihua, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, and Lei Li. "Glancing Transformer for Non-Autoregressive Neural Machine Translation." In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.acl-long.155.
Shan, Yong, Yang Feng, and Chenze Shao. "Modeling Coverage for Non-Autoregressive Neural Machine Translation." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533529.
Li, Zhuohan, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, and Tie-Yan Liu. "Hint-Based Training for Non-Autoregressive Machine Translation." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-1573.
Cheng, Hao, and Zhihua Zhang. "Con-NAT: Contrastive Non-autoregressive Neural Machine Translation." In Findings of the Association for Computational Linguistics: EMNLP 2022. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.findings-emnlp.463.
Huang, Chenyang, Fei Huang, Zaixiang Zheng, Osmar Zaïane, Hao Zhou, and Lili Mou. "Multilingual Non-Autoregressive Machine Translation without Knowledge Distillation." In Findings of the Association for Computational Linguistics: IJCNLP-AACL 2023 (Findings). Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-ijcnlp.14.
Shao, Chenze, Yang Feng, Jinchao Zhang, Fandong Meng, Xilin Chen, and Jie Zhou. "Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1288.