Three Strategies to Improve One-to-Many Multilingual Translation

Yining Wang, Jiajun Zhang, Feifei Zhai, Jingfang Xu, Chengqing Zong

Abstract
Due to the benefits of model compactness, multilingual translation (including many-to-one, many-to-many, and one-to-many) based on a universal encoder-decoder architecture has attracted increasing attention. However, previous studies show that one-to-many translation under this framework cannot perform on par with individually trained models. In this work, we introduce three strategies to improve one-to-many multilingual translation by balancing the shared and unique features. Within the architecture of one decoder for all target languages, we first explore using a unique initial state for each target language. We then employ language-dependent positional embeddings. Finally, and most importantly, we propose dividing the hidden cells of the decoder into shared and language-dependent ones. Extensive experiments demonstrate that our proposed methods achieve remarkable improvements over strong baselines. Moreover, our strategies can reach comparable or even better performance than individually trained translation models.
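
The abstract names three architectural strategies. The following minimal PyTorch sketch illustrates one possible reading of how they could fit together in a single decoder; it is not the authors' implementation. All names and dimensions (OneToManyDecoder, shared_dim, lang_dim, the per-language GRU for the language-dependent cells) are illustrative assumptions.

import torch
import torch.nn as nn

class OneToManyDecoder(nn.Module):
    """Toy one-to-many decoder combining the three strategies (illustrative)."""

    def __init__(self, vocab_size, num_langs, emb_dim=256,
                 shared_dim=384, lang_dim=128, max_len=100):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        # Strategy 2: a separate positional embedding table per target language.
        self.pos_emb = nn.ModuleList(
            nn.Embedding(max_len, emb_dim) for _ in range(num_langs))
        # Strategy 3: divide the hidden cells into ones shared across all
        # target languages and ones owned by a single language (modeled here,
        # as an assumption, by a shared GRU plus one small GRU per language).
        self.shared_rnn = nn.GRU(emb_dim, shared_dim, batch_first=True)
        self.lang_rnns = nn.ModuleList(
            nn.GRU(emb_dim, lang_dim, batch_first=True) for _ in range(num_langs))
        # Strategy 1: a unique learned initial state for each target language.
        self.init_shared = nn.Parameter(torch.zeros(num_langs, shared_dim))
        self.init_lang = nn.Parameter(torch.zeros(num_langs, lang_dim))
        self.out = nn.Linear(shared_dim + lang_dim, vocab_size)

    def forward(self, tokens, lang_id):
        # tokens: (batch, seq_len) target-side token ids; lang_id: int index
        # of the target language this batch is decoding into.
        batch, seq_len = tokens.shape
        positions = torch.arange(seq_len, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb[lang_id](positions)
        h0_shared = self.init_shared[lang_id].expand(1, batch, -1).contiguous()
        h0_lang = self.init_lang[lang_id].expand(1, batch, -1).contiguous()
        shared_out, _ = self.shared_rnn(x, h0_shared)
        lang_out, _ = self.lang_rnns[lang_id](x, h0_lang)
        # Concatenate shared and language-dependent cells for prediction.
        return self.out(torch.cat([shared_out, lang_out], dim=-1))

Usage under the same assumptions:

decoder = OneToManyDecoder(vocab_size=32000, num_langs=4)
logits = decoder(torch.randint(0, 32000, (2, 10)), lang_id=1)  # (2, 10, 32000)

The intent of the split is that the shared cells let all target languages pool capacity, while the small per-language cells can absorb language-specific divergences such as word order, which is one way to "balance shared and unique features" as the abstract puts it.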
Anthology ID:
D18-1326
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2955–2960
URL:
https://aclanthology.org/D18-1326
DOI:
10.18653/v1/D18-1326
Cite (ACL):
Yining Wang, Jiajun Zhang, Feifei Zhai, Jingfang Xu, and Chengqing Zong. 2018. Three Strategies to Improve One-to-Many Multilingual Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2955–2960, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Three Strategies to Improve One-to-Many Multilingual Translation (Wang et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1326.pdf