Reference Network for Neural Machine Translation

Han Fu, Chenghao Liu, Jianling Sun


Abstract
Neural Machine Translation (NMT) has achieved notable success in recent years. However, such a framework usually generates translations in isolation. In contrast, human translators often refer to reference data, either rephrasing intricate sentence fragments with common terms in the source language, or consulting the gold translation directly. In this paper, we propose a Reference Network to incorporate the referring process into the translation decoding of NMT. To construct a reference book, an intuitive way is to store the detailed translation history in extra memory, which is computationally expensive. Instead, we employ Local Coordinates Coding (LCC) to obtain global context vectors that contain monolingual and bilingual contextual information for NMT decoding. Experimental results on Chinese-English and English-German tasks demonstrate that the proposed model effectively improves translation quality at a lightweight computational cost.
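The abstract's key technical ingredient is Local Coordinates Coding (LCC), which represents a point as a locally linear combination of nearby anchor points. Below is a minimal, illustrative sketch of that general idea, assuming a toy anchor set and a distance-based softmax weighting; it is not the paper's actual model, and the function and variable names (lcc_encode, anchors, query) are hypothetical.

```python
# Illustrative sketch of Local Coordinates Coding (LCC).
# Assumption: anchors stand in for a compact "reference book" of context
# vectors, and the query stands in for a decoder state to be encoded.
import numpy as np


def lcc_encode(query, anchors, k=4):
    """Approximate `query` as a convex combination of its k nearest anchors.

    query:   (d,)   vector to encode, e.g. a decoder hidden state
    anchors: (m, d) anchor points forming the "reference book"
    returns: (weights over the chosen anchors, their indices, reconstruction)
    """
    # Distances from the query to every anchor point.
    dists = np.linalg.norm(anchors - query, axis=1)
    nearest = np.argsort(dists)[:k]

    # Softmax over negative distances: closer anchors receive larger weights.
    logits = -dists[nearest]
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()

    # Locally linear reconstruction of the query from its nearest anchors.
    reconstruction = weights @ anchors[nearest]
    return weights, nearest, reconstruction


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    anchors = rng.normal(size=(32, 8))   # toy anchor set (global context vectors)
    query = rng.normal(size=8)           # toy decoder state
    w, idx, recon = lcc_encode(query, anchors)
    print("anchor indices:", idx)
    print("weights:", np.round(w, 3))
    print("reconstruction error:", np.linalg.norm(query - recon))
```

In this reading, the weighted anchor vectors play the role of compact global context that the decoder can attend to, instead of storing and searching the full translation history in external memory.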
Anthology ID:
P19-1287
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3002–3012
URL:
https://aclanthology.org/P19-1287
DOI:
10.18653/v1/P19-1287
Cite (ACL):
Han Fu, Chenghao Liu, and Jianling Sun. 2019. Reference Network for Neural Machine Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3002–3012, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Reference Network for Neural Machine Translation (Fu et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1287.pdf