A Simple and Effective Approach to Coverage-Aware Neural Machine Translation

Yanyang Li, Tong Xiao, Yinqiao Li, Qiang Wang, Changming Xu, Jingbo Zhu


Abstract
We offer a simple and effective method to seek a better balance between model confidence and length preference for Neural Machine Translation (NMT). Unlike the popular length normalization and coverage models, our model does not require training, nor does it rerank a limited n-best list of outputs. Moreover, it is robust to large beam sizes, which is not well studied in previous work. On the Chinese-English and English-German translation tasks, our approach yields +0.4–1.5 BLEU improvements over the state-of-the-art baselines.
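For context on the baselines the abstract contrasts against, below is a minimal sketch of the widely used GNMT-style length normalization and coverage penalty (Wu et al., 2016). This is not the paper's own method; the function names and the `alpha`/`beta` defaults are illustrative assumptions.

```python
import math

def length_normalized_score(log_prob, length, alpha=0.6):
    """GNMT-style length normalization: divides the cumulative
    log-probability of a hypothesis by a length penalty so that
    longer translations are not unfairly penalized in beam search."""
    lp = ((5.0 + length) ** alpha) / ((5.0 + 1.0) ** alpha)
    return log_prob / lp

def coverage_penalty(attention_sums, beta=0.2):
    """GNMT-style coverage penalty: `attention_sums` holds, for each
    source word, its attention weight summed over all target steps.
    The penalty (<= 0) rewards hypotheses that attend to every source
    word at least once, discouraging under-translation."""
    return beta * sum(math.log(min(a, 1.0)) for a in attention_sums)

# Combined rescoring of a finished hypothesis:
def rescore(log_prob, length, attention_sums):
    return length_normalized_score(log_prob, length) + coverage_penalty(attention_sums)
```

Both terms are applied at decoding time only, so neither requires retraining; the abstract's point is that such heuristics still interact poorly with large beams, which motivates the proposed alternative.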
Anthology ID:
P18-2047
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
292–297
URL:
https://aclanthology.org/P18-2047
DOI:
10.18653/v1/P18-2047
Cite (ACL):
Yanyang Li, Tong Xiao, Yinqiao Li, Qiang Wang, Changming Xu, and Jingbo Zhu. 2018. A Simple and Effective Approach to Coverage-Aware Neural Machine Translation. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 292–297, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Simple and Effective Approach to Coverage-Aware Neural Machine Translation (Li et al., ACL 2018)
PDF:
https://aclanthology.org/P18-2047.pdf
Note:
P18-2047.Notes.pdf
Poster:
P18-2047.Poster.pdf