A Shared Attention Mechanism for Interpretation of Neural Automatic Post-Editing Systems

Inigo Jauregi Unanue, Ehsan Zare Borzeshi, Massimo Piccardi


Abstract
Automatic post-editing (APE) systems aim to correct the systematic errors made by machine translators. In this paper, we propose a neural APE system that encodes the source (src) and machine-translated (mt) sentences with two separate encoders, but leverages a shared attention mechanism to better understand how the two inputs contribute to the generation of the post-edited (pe) sentences. Our empirical observations have shown that when the mt is incorrect, the attention shifts weight toward tokens in the src sentence to properly edit the incorrect translation. The model has been trained and evaluated on the official data from the WMT16 and WMT17 APE IT-domain English-German shared tasks. Additionally, we have used the extra 500K artificial data provided by the shared task. Our system has been able to reproduce the accuracies of systems trained with the same data, while at the same time providing better interpretability.
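The shared attention described in the abstract can be sketched roughly as follows: a single attention distribution is computed over the concatenation of the src and mt encoder states, so the resulting weights directly show how much each input contributes at every decoding step. This is a minimal, illustrative simplification (all names, shapes, and the scoring function are assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def shared_attention(dec_state, src_states, mt_states, W):
    """Hypothetical shared attention over two encoders.

    dec_state:  (d,)      current decoder hidden state
    src_states: (Ls, d)   src encoder hidden states
    mt_states:  (Lm, d)   mt encoder hidden states
    W:          (d, d)    shared bilinear scoring matrix

    Returns the context vector, the full attention distribution,
    and the total attention mass placed on the src side (a simple
    interpretability signal: it rises when the mt is unreliable).
    """
    states = np.concatenate([src_states, mt_states], axis=0)  # (Ls+Lm, d)
    scores = states @ W @ dec_state                           # (Ls+Lm,)
    weights = softmax(scores)
    context = weights @ states                                # (d,)
    src_mass = weights[: len(src_states)].sum()
    return context, weights, src_mass

# Tiny usage example with random states
rng = np.random.default_rng(0)
d = 4
src = rng.standard_normal((3, d))
mt = rng.standard_normal((5, d))
dec = rng.standard_normal(d)
context, weights, src_mass = shared_attention(dec, src, mt, np.eye(d))
```

Because the distribution spans both inputs, inspecting `src_mass` per output token is one way to visualize the attention shift toward the src sentence that the abstract reports.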
Anthology ID:
W18-2702
Volume:
Proceedings of the 2nd Workshop on Neural Machine Translation and Generation
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Alexandra Birch, Andrew Finch, Thang Luong, Graham Neubig, Yusuke Oda
Venue:
NGT
Publisher:
Association for Computational Linguistics
Pages:
11–17
URL:
https://aclanthology.org/W18-2702
DOI:
10.18653/v1/W18-2702
Cite (ACL):
Inigo Jauregi Unanue, Ehsan Zare Borzeshi, and Massimo Piccardi. 2018. A Shared Attention Mechanism for Interpretation of Neural Automatic Post-Editing Systems. In Proceedings of the 2nd Workshop on Neural Machine Translation and Generation, pages 11–17, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Shared Attention Mechanism for Interpretation of Neural Automatic Post-Editing Systems (Jauregi Unanue et al., NGT 2018)
PDF:
https://aclanthology.org/W18-2702.pdf
Code:
ijauregiCMCRC/Shared_Attention_for_APE
Data:
WMT 2016