Towards Decoding as Continuous Optimisation in Neural Machine Translation

Cong Duy Vu Hoang, Gholamreza Haffari, Trevor Cohn


Abstract
We propose a novel decoding approach for neural machine translation (NMT) based on continuous optimisation. We reformulate decoding, a discrete optimisation problem, as a continuous problem, so that it can be solved using efficient gradient-based techniques. This decoding framework allows more accurate decoding for standard NMT models, and also enables decoding in otherwise intractable settings, such as the intersection of several different NMT models. Our empirical results show that the framework is effective and can lead to substantial improvements in translation quality, especially in situations where greedy search and beam search are not feasible. Finally, we show that the technique is highly competitive with, and complementary to, reranking.
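The abstract only sketches the idea at a high level. As a hedged illustration (not the paper's actual algorithm), the core move is to replace each discrete target word with a point on the probability simplex, so the decoding objective becomes differentiable and can be improved by gradient ascent. Everything below is invented for the toy: `unigram` and `bigram` stand in for a real NMT model's scores, and a numerical gradient stands in for autodiff; the paper optimises a trained model's log-likelihood with methods such as SGD or exponentiated gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
V, T = 5, 3                       # toy vocabulary size and target length
unigram = rng.normal(size=V)      # stand-in per-word scores (not a real NMT model)
bigram = rng.normal(size=(V, V))  # stand-in scores for consecutive word pairs

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def score(logits):
    """Differentiable surrogate of the model score on a *soft* translation.

    Each row of softmax(logits) lies on the simplex and stands in for one
    discrete target word, turning discrete search into continuous optimisation.
    """
    p = softmax(logits)                       # (T, V) relaxed word choices
    s = (p @ unigram).sum()                   # soft unigram score
    s += sum(p[t] @ bigram @ p[t + 1] for t in range(T - 1))  # soft bigram score
    return s

def num_grad(f, x, eps=1e-5):
    """Central-difference gradient; real systems would use autodiff instead."""
    g = np.zeros_like(x)
    for i in np.ndindex(*x.shape):
        x[i] += eps; hi = f(x)
        x[i] -= 2 * eps; lo = f(x)
        x[i] += eps                           # restore the entry
        g[i] = (hi - lo) / (2 * eps)
    return g

logits = np.zeros((T, V))                     # uniform relaxed initialisation
for _ in range(200):                          # gradient ascent on the relaxed objective
    logits = logits + 0.5 * num_grad(score, logits)

# Round the optimised simplex points back to discrete words.
translation = softmax(logits).argmax(axis=-1)
```

The final `argmax` rounding is the simplest way back to a discrete translation; because the relaxed objective is non-concave, initialisation and step size matter, which is one reason the paper also compares against and combines with reranking.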
Anthology ID:
D17-1014
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
146–156
URL:
https://aclanthology.org/D17-1014
DOI:
10.18653/v1/D17-1014
Cite (ACL):
Cong Duy Vu Hoang, Gholamreza Haffari, and Trevor Cohn. 2017. Towards Decoding as Continuous Optimisation in Neural Machine Translation. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 146–156, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Towards Decoding as Continuous Optimisation in Neural Machine Translation (Hoang et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1014.pdf
Attachment:
 D17-1014.Attachment.pdf
Video:
 https://aclanthology.org/D17-1014.mp4