Encoding Gated Translation Memory into Neural Machine Translation

Qian Cao, Deyi Xiong


Abstract
Translation memories (TM) enable human translators to reuse existing repetitive translation fragments. In this paper, we propose a novel method to combine the strengths of both TM and neural machine translation (NMT) for high-quality translation. We treat the target translation of a TM match as an additional reference input and encode it into NMT with an extra encoder. A gating mechanism is further used to balance the impact of the TM match on the NMT decoder. Experimental results on the UN corpus demonstrate that when fuzzy match scores are higher than 50%, the quality of NMT translation can be significantly improved by over 10 BLEU points.
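The gating idea in the abstract can be sketched as follows: the decoder attends separately to the source encoder and to the extra TM-match encoder, and a learned sigmoid gate interpolates the two context vectors. This is a minimal illustrative sketch in NumPy, not the paper's actual implementation; the names (`c_src`, `c_tm`, `W_g`) and the exact gate parameterization are assumptions.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def gated_context(s, c_src, c_tm, W_g, b_g):
    """Blend source context and TM-match context with a learned gate.

    s        -- current decoder hidden state
    c_src    -- attention context over the source-sentence encoder
    c_tm     -- attention context over the extra TM-match encoder
    W_g, b_g -- gate parameters (illustrative; the paper's exact
                parameterization may differ)
    """
    # Gate in (0, 1), computed from the decoder state and both contexts.
    g = sigmoid(W_g @ np.concatenate([s, c_src, c_tm]) + b_g)
    # Element-wise interpolation: g weights the source context,
    # (1 - g) weights the TM-match context.
    return g * c_src + (1.0 - g) * c_tm


rng = np.random.default_rng(0)
d = 4  # toy hidden size
s, c_src, c_tm = rng.standard_normal((3, d))
W_g = rng.standard_normal((d, 3 * d))
b_g = np.zeros(d)
c = gated_context(s, c_src, c_tm, W_g, b_g)
```

Because the gate is a sigmoid, each dimension of `c` is a convex combination of the corresponding source and TM context values, so a near-zero gate lets the decoder fall back to ordinary NMT when the fuzzy match is poor.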
Anthology ID:
D18-1340
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3042–3047
URL:
https://aclanthology.org/D18-1340
DOI:
10.18653/v1/D18-1340
Cite (ACL):
Qian Cao and Deyi Xiong. 2018. Encoding Gated Translation Memory into Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3042–3047, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Encoding Gated Translation Memory into Neural Machine Translation (Cao & Xiong, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1340.pdf