Title | XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation |
Author | Yong Wang; Shilin He; Guanhua Chen; et al. |
Corresponding Author | Guanhua Chen; Daxin Jiang |
Publication Date | 2022-12-07 |
Conference Name | The 2022 Conference on Empirical Methods in Natural Language Processing |
Source Title | Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing |
Pages | 6934–6946 |
Conference Date | 2022-12-07 |
Conference Place | Abu Dhabi |
Abstract | Pre-trained language models have achieved remarkable success in numerous natural language understanding and autoregressive generation tasks, but non-autoregressive generation in applications such as machine translation has not sufficiently benefited from the pre-training paradigm. In this work, we establish the connection between a pre-trained masked language model (MLM) and non-autoregressive generation in machine translation. From this perspective, we present XLM-D, which seamlessly transforms an off-the-shelf cross-lingual pre-trained model into a non-autoregressive translation (NAT) model with a lightweight yet effective decorator. Specifically, the decorator ensures the representation consistency of the pre-trained model and introduces only one additional trainable parameter. Extensive experiments on typical translation datasets show that our models obtain state-of-the-art performance while achieving a 19.9x inference speed-up. One striking result is that on WMT14 En-De, our XLM-D obtains 29.80 BLEU points with multiple iterations, outperforming the previous mask-predict model by 2.77 points. |
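For readers unfamiliar with the mask-predict decoding the abstract refers to, the sketch below illustrates the iterative non-autoregressive inference loop. It is a minimal toy, not the paper's implementation: `mlm_fill` is a hypothetical stand-in for the pre-trained cross-lingual MLM, and `MASK`, `VOCAB_SIZE`, and the linear re-masking schedule are illustrative assumptions.

```python
# Minimal sketch of mask-predict style iterative NAT decoding (a toy, not the
# paper's code). `mlm_fill` stands in for a real pre-trained masked LM.
import numpy as np

MASK = -1          # sentinel id for a masked target position (assumption)
VOCAB_SIZE = 32    # toy vocabulary size (assumption)
rng = np.random.default_rng(0)

def mlm_fill(tokens: np.ndarray) -> np.ndarray:
    """Stand-in MLM: returns a (length, vocab) probability matrix.
    A real system would run the pre-trained cross-lingual encoder here."""
    logits = rng.normal(size=(len(tokens), VOCAB_SIZE))
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return probs / probs.sum(axis=-1, keepdims=True)

def mask_predict(length: int, iterations: int = 4) -> np.ndarray:
    """Predict all masked positions in parallel, then repeatedly re-mask
    and re-predict the least confident tokens."""
    tokens = np.full(length, MASK)
    for t in range(iterations):
        probs = mlm_fill(tokens)
        confidence = probs.max(axis=-1)
        masked = tokens == MASK
        tokens[masked] = probs.argmax(axis=-1)[masked]  # parallel fill
        if t + 1 == iterations:
            break
        # linearly decay how many tokens get re-masked each iteration
        n_remask = int(length * (1 - (t + 1) / iterations))
        tokens[np.argsort(confidence)[:n_remask]] = MASK
    return tokens

print(mask_predict(length=10, iterations=4))
```

Each iteration fills every target position with a single forward pass, which is where the speed-up over token-by-token autoregressive decoding comes from; running more iterations trades speed for quality, matching the abstract's "multiple iterations" setting.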
SUSTech Authorship | Corresponding |
Language | English |
Data Source | Manual submission |
PDF URL | https://aclanthology.org/2022.emnlp-main.466/ |
Publication Status | Formally published |
Document Type | Conference paper |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/524072 |
Department | Department of Statistics and Data Science |
Affiliation | 1. Tencent Corporation; 2. Microsoft Corporation; 3. Southern University of Science and Technology; 4. Shanghai University of Finance and Economics |
Corresponding Author Affiliation | Southern University of Science and Technology |
Recommended Citation (GB/T 7714) | Yong Wang, Shilin He, Guanhua Chen, et al. XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation[C], 2022: 6934–6946. |
Files in This Item:
File Name/Size | DocType | Version | Access | License
2022.emnlp-main.466 (1294KB) | -- | -- | Restricted Access | --
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.