Table of Contents
A selection of scholarly literature on the topic "Non-autoregressive Machine Translation"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Non-autoregressive Machine Translation".
Next to every work in the list of sources there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication in PDF format and read its online abstract, if the relevant parameters are available in the metadata.
Journal articles on the topic "Non-autoregressive Machine Translation"
Wang, Yiren, Fei Tian, Di He, Tao Qin, ChengXiang Zhai, and Tie-Yan Liu. "Non-Autoregressive Machine Translation with Auxiliary Regularization." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5377–84. http://dx.doi.org/10.1609/aaai.v33i01.33015377.
Wang, Shuheng, Shumin Shi, Heyan Huang, and Wei Zhang. "Improving Non-Autoregressive Machine Translation via Autoregressive Training." Journal of Physics: Conference Series 2031, no. 1 (September 1, 2021): 012045. http://dx.doi.org/10.1088/1742-6596/2031/1/012045.
Shao, Chenze, Jinchao Zhang, Jie Zhou, and Yang Feng. "Rephrasing the Reference for Non-autoregressive Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13538–46. http://dx.doi.org/10.1609/aaai.v37i11.26587.
Ran, Qiu, Yankai Lin, Peng Li, and Jie Zhou. "Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 15 (May 18, 2021): 13727–35. http://dx.doi.org/10.1609/aaai.v35i15.17618.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Enhanced encoder for non-autoregressive machine translation." Machine Translation 35, no. 4 (November 16, 2021): 595–609. http://dx.doi.org/10.1007/s10590-021-09285-x.
Shao, Chenze, Jinchao Zhang, Yang Feng, Fandong Meng, and Jie Zhou. "Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 198–205. http://dx.doi.org/10.1609/aaai.v34i01.5351.
Li, Feng, Jingxian Chen, and Xuejun Zhang. "A Survey of Non-Autoregressive Neural Machine Translation." Electronics 12, no. 13 (July 6, 2023): 2980. http://dx.doi.org/10.3390/electronics12132980.
Liu, Min, Yu Bao, Chengqi Zhao, and Shujian Huang. "Selective Knowledge Distillation for Non-Autoregressive Neural Machine Translation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13246–54. http://dx.doi.org/10.1609/aaai.v37i11.26555.
Du, Quan, Kai Feng, Chen Xu, Tong Xiao, and Jingbo Zhu. "Non-autoregressive neural machine translation with auxiliary representation fusion." Journal of Intelligent & Fuzzy Systems 41, no. 6 (December 16, 2021): 7229–39. http://dx.doi.org/10.3233/jifs-211105.
Xinlu, Zhang, Wu Hongguan, Ma Beijiao, and Zhai Zhengang. "Research on Low Resource Neural Machine Translation Based on Non-autoregressive Model." Journal of Physics: Conference Series 2171, no. 1 (January 1, 2022): 012045. http://dx.doi.org/10.1088/1742-6596/2171/1/012045.
Dissertations on the topic "Non-autoregressive Machine Translation"
Xu, Jitao. "Writing in two languages: Neural machine translation as an assistive bilingual writing tool." Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG078.
In an increasingly global world, more situations appear where people need to express themselves in a foreign language or multiple languages. However, for many people, writing in a foreign language is not an easy task. Machine translation tools can help generate texts in multiple languages. With the tangible progress in neural machine translation (NMT), translation technologies are delivering usable translations in a growing number of contexts. However, it is not yet realistic for NMT systems to produce error-free translations. Therefore, users with a good command of a given foreign language may find assistance from computer-aided translation technologies. In case of difficulties, users writing in a foreign language can access external resources such as dictionaries, terminologies, or bilingual concordancers. However, consulting these resources causes an interruption in the writing process and starts another cognitive activity. To make the process smoother, it is possible to extend writing assistant systems to support bilingual text composition. However, existing studies mainly focused on generating texts in a foreign language. We suggest that showing corresponding texts in the user's mother tongue can also help users to verify the composed texts with synchronized bitexts. In this thesis, we study techniques to build bilingual writing assistant systems that allow free composition in both languages and display synchronized monolingual texts in the two languages. We introduce two types of simulated interactive systems. The first solution allows users to compose mixed-language texts, which are then translated into their monolingual counterparts. We propose a dual decoder Transformer model comprising a shared encoder and two decoders to simultaneously produce texts in two languages.
We also explore the dual decoder model for various other tasks, such as multi-target translation, bidirectional translation, generating translation variants, and multilingual subtitling. The second design aims to extend commercial online translation systems by letting users freely alternate between the two languages, switching between the two input boxes at will. In this scenario, the technical challenge is to keep the two input texts synchronized while taking the users' inputs into account, again with the goal of authoring two equally good versions of the text. For this, we introduce a general bilingual synchronization task and implement and experiment with autoregressive and non-autoregressive synchronization systems. We also investigate bilingual synchronization models on specific downstream tasks, such as parallel corpus cleaning and NMT with translation memories, to study the generalization ability of the proposed models.
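The shared-encoder, dual-decoder architecture described in this abstract can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the thesis code: the class name, vocabulary size, and model dimensions are assumptions, the two decoders here are independent (the thesis decoders generate simultaneously and may interact), and causal target masks are omitted for brevity.

```python
import torch
import torch.nn as nn

class DualDecoderTransformer(nn.Module):
    """One shared encoder, two decoders: one output text per language."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # Two separate decoder stacks attend to the same encoder memory.
        self.decoder_a = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.decoder_b = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.out_a = nn.Linear(d_model, vocab_size)
        self.out_b = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt_a, tgt_b):
        # Shared encoding of the (possibly mixed-language) input text.
        memory = self.encoder(self.embed(src))
        # Each decoder produces one monolingual version from the same memory.
        h_a = self.decoder_a(self.embed(tgt_a), memory)
        h_b = self.decoder_b(self.embed(tgt_b), memory)
        return self.out_a(h_a), self.out_b(h_b)

model = DualDecoderTransformer()
src = torch.randint(0, 1000, (2, 7))    # batch of 2 source sequences, length 7
tgt_a = torch.randint(0, 1000, (2, 5))  # language-A target prefix, length 5
tgt_b = torch.randint(0, 1000, (2, 6))  # language-B target prefix, length 6
logits_a, logits_b = model(src, tgt_a, tgt_b)
print(logits_a.shape, logits_b.shape)
# → torch.Size([2, 5, 1000]) torch.Size([2, 6, 1000])
```

The key design point is that both decoders read the same encoder memory, so the two output texts are grounded in one shared representation of the input.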
Book chapters on the topic "Non-autoregressive Machine Translation"
Zhou, Long, Jiajun Zhang, Yang Zhao, and Chengqing Zong. "Non-autoregressive Neural Machine Translation with Distortion Model." In Natural Language Processing and Chinese Computing, 403–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60450-9_32.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Improving Non-autoregressive Machine Translation with Soft-Masking." In Natural Language Processing and Chinese Computing, 141–52. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88480-2_12.
Guo, Ziyue, Hongxu Hou, Nier Wu, and Shuo Sun. "Word-Level Error Correction in Non-autoregressive Neural Machine Translation." In Communications in Computer and Information Science, 726–33. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63820-7_83.
Wang, Yisong, Hongxu Hou, Shuo Sun, Nier Wu, Weichen Jian, Zongheng Yang, and Pengcong Wang. "Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation." In Communications in Computer and Information Science, 72–81. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7960-6_8.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Non-autoregressive Machine Translation"
Bao, Guangsheng, Zhiyang Teng, Hao Zhou, Jianhao Yan, and Yue Zhang. "Non-Autoregressive Document-Level Machine Translation." In Findings of the Association for Computational Linguistics: EMNLP 2023. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-emnlp.986.
Xu, Jitao, Josep Crego, and François Yvon. "Integrating Translation Memories into Non-Autoregressive Machine Translation." In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.eacl-main.96.
Saharia, Chitwan, William Chan, Saurabh Saxena, and Mohammad Norouzi. "Non-Autoregressive Machine Translation with Latent Alignments." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.83.
Wei, Bingzhen, Mingxuan Wang, Hao Zhou, Junyang Lin, and Xu Sun. "Imitation Learning for Non-Autoregressive Neural Machine Translation." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1125.
Qian, Lihua, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, and Lei Li. "Glancing Transformer for Non-Autoregressive Neural Machine Translation." In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.acl-long.155.
Shan, Yong, Yang Feng, and Chenze Shao. "Modeling Coverage for Non-Autoregressive Neural Machine Translation." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533529.
Li, Zhuohan, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, and Tie-Yan Liu. "Hint-Based Training for Non-Autoregressive Machine Translation." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-1573.
Cheng, Hao, and Zhihua Zhang. "Con-NAT: Contrastive Non-autoregressive Neural Machine Translation." In Findings of the Association for Computational Linguistics: EMNLP 2022. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.findings-emnlp.463.
Huang, Chenyang, Fei Huang, Zaixiang Zheng, Osmar Zaïane, Hao Zhou, and Lili Mou. "Multilingual Non-Autoregressive Machine Translation without Knowledge Distillation." In Findings of the Association for Computational Linguistics: IJCNLP-AACL 2023 (Findings). Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-ijcnlp.14.
Shao, Chenze, Yang Feng, Jinchao Zhang, Fandong Meng, Xilin Chen, and Jie Zhou. "Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1288.