Selected scientific literature on the topic "Non-autoregressive Machine Translation"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Non-autoregressive Machine Translation".
Next to each source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online if it is included in the metadata.
Journal articles on the topic "Non-autoregressive Machine Translation"
Wang, Yiren, Fei Tian, Di He, Tao Qin, ChengXiang Zhai, and Tie-Yan Liu. "Non-Autoregressive Machine Translation with Auxiliary Regularization". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5377–84. http://dx.doi.org/10.1609/aaai.v33i01.33015377.
Wang, Shuheng, Shumin Shi, Heyan Huang, and Wei Zhang. "Improving Non-Autoregressive Machine Translation via Autoregressive Training". Journal of Physics: Conference Series 2031, no. 1 (September 1, 2021): 012045. http://dx.doi.org/10.1088/1742-6596/2031/1/012045.
Shao, Chenze, Jinchao Zhang, Jie Zhou, and Yang Feng. "Rephrasing the Reference for Non-autoregressive Machine Translation". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13538–46. http://dx.doi.org/10.1609/aaai.v37i11.26587.
Ran, Qiu, Yankai Lin, Peng Li, and Jie Zhou. "Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 15 (May 18, 2021): 13727–35. http://dx.doi.org/10.1609/aaai.v35i15.17618.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Enhanced encoder for non-autoregressive machine translation". Machine Translation 35, no. 4 (November 16, 2021): 595–609. http://dx.doi.org/10.1007/s10590-021-09285-x.
Shao, Chenze, Jinchao Zhang, Yang Feng, Fandong Meng, and Jie Zhou. "Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 198–205. http://dx.doi.org/10.1609/aaai.v34i01.5351.
Li, Feng, Jingxian Chen, and Xuejun Zhang. "A Survey of Non-Autoregressive Neural Machine Translation". Electronics 12, no. 13 (July 6, 2023): 2980. http://dx.doi.org/10.3390/electronics12132980.
Liu, Min, Yu Bao, Chengqi Zhao, and Shujian Huang. "Selective Knowledge Distillation for Non-Autoregressive Neural Machine Translation". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 11 (June 26, 2023): 13246–54. http://dx.doi.org/10.1609/aaai.v37i11.26555.
Du, Quan, Kai Feng, Chen Xu, Tong Xiao, and Jingbo Zhu. "Non-autoregressive neural machine translation with auxiliary representation fusion". Journal of Intelligent & Fuzzy Systems 41, no. 6 (December 16, 2021): 7229–39. http://dx.doi.org/10.3233/jifs-211105.
Xinlu, Zhang, Wu Hongguan, Ma Beijiao, and Zhai Zhengang. "Research on Low Resource Neural Machine Translation Based on Non-autoregressive Model". Journal of Physics: Conference Series 2171, no. 1 (January 1, 2022): 012045. http://dx.doi.org/10.1088/1742-6596/2171/1/012045.
Theses / dissertations on the topic "Non-autoregressive Machine Translation"
Xu, Jitao. "Writing in two languages : Neural machine translation as an assistive bilingual writing tool". Electronic thesis or dissertation, Université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG078.
In an increasingly global world, more and more situations arise in which people need to express themselves in a foreign language or in multiple languages. However, for many people, writing in a foreign language is not an easy task. Machine translation tools can help generate texts in multiple languages. With the tangible progress in neural machine translation (NMT), translation technologies are delivering usable translations in a growing number of contexts. However, it is not yet realistic for NMT systems to produce error-free translations. Therefore, users with a good command of a given foreign language may find assistance from computer-aided translation technologies. In case of difficulties, users writing in a foreign language can access external resources such as dictionaries, terminologies, or bilingual concordancers. However, consulting these resources interrupts the writing process and starts another cognitive activity. To make the process smoother, it is possible to extend writing assistant systems to support bilingual text composition. However, existing studies have mainly focused on generating texts in a foreign language. We suggest that showing corresponding texts in the user's mother tongue can also help users verify the composed texts with synchronized bitexts. In this thesis, we study techniques to build bilingual writing assistant systems that allow free composition in both languages and display synchronized monolingual texts in the two languages. We introduce two types of simulated interactive systems. The first solution allows users to compose mixed-language texts, which are then translated into their monolingual counterparts. We propose a dual decoder Transformer model comprising a shared encoder and two decoders to simultaneously produce texts in two languages. We also explore the dual decoder model for various other tasks, such as multi-target translation, bidirectional translation, generating translation variants, and multilingual subtitling. The second design aims to extend commercial online translation systems by letting users freely alternate between the two languages, changing the text input box at will. In this scenario, the technical challenge is to keep the two input texts synchronized while taking the users' inputs into account, again with the goal of authoring two equally good versions of the text. For this, we introduce a general bilingual synchronization task and implement and experiment with autoregressive and non-autoregressive synchronization systems. We also investigate bilingual synchronization models on specific downstream tasks, such as parallel corpus cleaning and NMT with translation memories, to study the generalization ability of the proposed models.
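The dual decoder design described in this abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch of the shared-encoder / dual-decoder idea, written for this page rather than taken from the thesis: the class name DualDecoderTransformer, all layer sizes, and the learned positional embedding are assumptions, and any interaction the actual model may use between the two decoders is omitted for brevity.

    # Illustrative sketch only, not the thesis implementation.
    import torch
    import torch.nn as nn

    def causal_mask(size: int, device: torch.device) -> torch.Tensor:
        # Upper-triangular -inf mask: each position attends only to its past.
        return torch.triu(
            torch.full((size, size), float("-inf"), device=device), diagonal=1)

    class DualDecoderTransformer(nn.Module):
        """One shared encoder; two decoders, one per output language."""

        def __init__(self, vocab_size=32000, d_model=512, nhead=8,
                     num_layers=6, dim_ff=2048, max_len=512):
            super().__init__()
            self.tok = nn.Embedding(vocab_size, d_model)
            self.pos = nn.Embedding(max_len, d_model)  # learned positions, for brevity

            # Single encoder shared by both output languages.
            self.encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, dim_ff, batch_first=True),
                num_layers)

            def new_decoder():
                return nn.TransformerDecoder(
                    nn.TransformerDecoderLayer(d_model, nhead, dim_ff, batch_first=True),
                    num_layers)

            # Independent decoder and output projection per language.
            self.dec_l1, self.dec_l2 = new_decoder(), new_decoder()
            self.out_l1 = nn.Linear(d_model, vocab_size)
            self.out_l2 = nn.Linear(d_model, vocab_size)

        def embed(self, ids):
            positions = torch.arange(ids.size(1), device=ids.device)
            return self.tok(ids) + self.pos(positions)

        def forward(self, src_ids, tgt1_ids, tgt2_ids):
            # Encode the (possibly mixed-language) source once ...
            memory = self.encoder(self.embed(src_ids))
            # ... then decode both monolingual outputs from the same memory.
            h1 = self.dec_l1(self.embed(tgt1_ids), memory,
                             tgt_mask=causal_mask(tgt1_ids.size(1), src_ids.device))
            h2 = self.dec_l2(self.embed(tgt2_ids), memory,
                             tgt_mask=causal_mask(tgt2_ids.size(1), src_ids.device))
            return self.out_l1(h1), self.out_l2(h2)  # one set of logits per language

In this arrangement the source is encoded once and both decoders read the same memory, so a training step could sum a cross-entropy loss per output language and update the shared encoder from both. That sharing, rather than running two separate NMT systems, is what makes simultaneous generation of the two synchronized texts cheap.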
Book chapters on the topic "Non-autoregressive Machine Translation"
Zhou, Long, Jiajun Zhang, Yang Zhao, and Chengqing Zong. "Non-autoregressive Neural Machine Translation with Distortion Model". In Natural Language Processing and Chinese Computing, 403–15. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60450-9_32.
Wang, Shuheng, Shumin Shi, and Heyan Huang. "Improving Non-autoregressive Machine Translation with Soft-Masking". In Natural Language Processing and Chinese Computing, 141–52. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88480-2_12.
Guo, Ziyue, Hongxu Hou, Nier Wu, and Shuo Sun. "Word-Level Error Correction in Non-autoregressive Neural Machine Translation". In Communications in Computer and Information Science, 726–33. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-63820-7_83.
Wang, Yisong, Hongxu Hou, Shuo Sun, Nier Wu, Weichen Jian, Zongheng Yang, and Pengcong Wang. "Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation". In Communications in Computer and Information Science, 72–81. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-7960-6_8.
Texto completo da fonteTrabalhos de conferências sobre o assunto "Non-autoregressive Machine Translation"
Bao, Guangsheng, Zhiyang Teng, Hao Zhou, Jianhao Yan, and Yue Zhang. "Non-Autoregressive Document-Level Machine Translation". In Findings of the Association for Computational Linguistics: EMNLP 2023. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-emnlp.986.
Xu, Jitao, Josep Crego, and François Yvon. "Integrating Translation Memories into Non-Autoregressive Machine Translation". In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.eacl-main.96.
Saharia, Chitwan, William Chan, Saurabh Saxena, and Mohammad Norouzi. "Non-Autoregressive Machine Translation with Latent Alignments". In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.83.
Wei, Bingzhen, Mingxuan Wang, Hao Zhou, Junyang Lin, and Xu Sun. "Imitation Learning for Non-Autoregressive Neural Machine Translation". In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1125.
Qian, Lihua, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, and Lei Li. "Glancing Transformer for Non-Autoregressive Neural Machine Translation". In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2021. http://dx.doi.org/10.18653/v1/2021.acl-long.155.
Shan, Yong, Yang Feng, and Chenze Shao. "Modeling Coverage for Non-Autoregressive Neural Machine Translation". In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533529.
Li, Zhuohan, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, and Tie-Yan Liu. "Hint-Based Training for Non-Autoregressive Machine Translation". In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/d19-1573.
Cheng, Hao, and Zhihua Zhang. "Con-NAT: Contrastive Non-autoregressive Neural Machine Translation". In Findings of the Association for Computational Linguistics: EMNLP 2022. Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.findings-emnlp.463.
Huang, Chenyang, Fei Huang, Zaixiang Zheng, Osmar Zaïane, Hao Zhou, and Lili Mou. "Multilingual Non-Autoregressive Machine Translation without Knowledge Distillation". In Findings of the Association for Computational Linguistics: IJCNLP-AACL 2023 (Findings). Stroudsburg, PA, USA: Association for Computational Linguistics, 2023. http://dx.doi.org/10.18653/v1/2023.findings-ijcnlp.14.
Shao, Chenze, Yang Feng, Jinchao Zhang, Fandong Meng, Xilin Chen, and Jie Zhou. "Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation". In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/p19-1288.