Hybrid Neural Network Alignment and Lexicon Model in Direct HMM for Statistical Machine Translation

Weiyue Wang, Tamer Alkhouli, Derui Zhu, Hermann Ney


Abstract
Recently, neural machine translation systems have shown promising performance and surpassed phrase-based systems on most translation tasks. Revisiting conventional machine translation concepts while utilizing effective neural models is vital for understanding the leap that neural machine translation has achieved over phrase-based methods. This work proposes a direct HMM with neural network-based lexicon and alignment models, which are trained jointly using the Baum-Welch algorithm. The direct HMM is applied to rerank the n-best lists created by a state-of-the-art phrase-based translation system, and it provides improvements of up to 1.0% Bleu on two different translation tasks.
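The sketch below illustrates the E-step of the Baum-Welch training named in the abstract, for an HMM whose hidden states are source alignment positions. It is only a minimal illustration under assumed interfaces: the arrays emit, trans, and init and the function forward_backward are hypothetical placeholders; in the paper's setting the neural lexicon and alignment models would supply these probabilities, and the resulting posteriors would serve as soft counts for their joint training.

# Minimal sketch of the Baum-Welch E-step (forward-backward) for an
# HMM alignment model. Hypothetical interface; the neural lexicon and
# alignment models would provide the `emit` and `trans` probabilities.
import numpy as np

def forward_backward(emit, trans, init):
    """emit:  (I, J) lexicon probabilities p(e_i | f_j)
       trans: (J, J) alignment probabilities p(j | j')
       init:  (J,)   initial alignment probabilities p(j)
       Returns state posteriors gamma (I, J) and the sentence likelihood.
       (Scaling / log-space arithmetic omitted for brevity.)"""
    I, J = emit.shape
    alpha = np.zeros((I, J))
    beta = np.zeros((I, J))
    # forward pass: alpha[i, j] = p(e_1..e_i, a_i = j)
    alpha[0] = init * emit[0]
    for i in range(1, I):
        alpha[i] = (alpha[i - 1] @ trans) * emit[i]
    # backward pass: beta[i, j] = p(e_{i+1}..e_I | a_i = j)
    beta[-1] = 1.0
    for i in range(I - 2, -1, -1):
        beta[i] = trans @ (emit[i + 1] * beta[i + 1])
    likelihood = alpha[-1].sum()
    gamma = alpha * beta / likelihood  # posterior p(a_i = j | e, f)
    return gamma, likelihood

In such a scheme, the posteriors gamma would be used as soft alignment targets when updating the neural lexicon and alignment models in the M-step, which is what joint training with Baum-Welch amounts to here.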
Anthology ID:
P17-2020
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
125–131
URL:
https://aclanthology.org/P17-2020
DOI:
10.18653/v1/P17-2020
Cite (ACL):
Weiyue Wang, Tamer Alkhouli, Derui Zhu, and Hermann Ney. 2017. Hybrid Neural Network Alignment and Lexicon Model in Direct HMM for Statistical Machine Translation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 125–131, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Hybrid Neural Network Alignment and Lexicon Model in Direct HMM for Statistical Machine Translation (Wang et al., ACL 2017)
PDF:
https://aclanthology.org/P17-2020.pdf
Video:
https://aclanthology.org/P17-2020.mp4