SGNMT – A Flexible NMT Decoding Platform for Quick Prototyping of New Models and Search Strategies

Felix Stahlberg, Eva Hasler, Danielle Saunders, Bill Byrne


Abstract
This paper introduces SGNMT, our experimental platform for machine translation research. SGNMT provides a generic interface to neural and symbolic scoring modules (predictors) with left-to-right semantics, such as translation models like NMT, language models, translation lattices, n-best lists, or other kinds of scores and constraints. Predictors can be combined with other predictors to form complex decoding tasks. SGNMT implements a number of search strategies for traversing the space spanned by the predictors, which are appropriate for different predictor constellations. Adding new predictors or decoding strategies is particularly easy, making it a very efficient tool for prototyping new research ideas. SGNMT is actively being used by students in the MPhil program in Machine Learning, Speech and Language Technology at the University of Cambridge for course work and theses, as well as for most of the research work in our group.
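The predictor abstraction with left-to-right semantics can be illustrated with a short sketch. The following is a minimal, hypothetical custom predictor in Python, assuming the interface outlined in the paper (initialize, predict_next, consume, get_state, set_state); the base class, module paths, and helper names (cam.sgnmt.predictors.core.Predictor, utils.EOS_ID, get_unk_probability) are taken from the SGNMT codebase as we understand it and may differ across versions.

    # Hypothetical sketch of a custom SGNMT predictor; consult the
    # SGNMT documentation for the authoritative interface.
    from cam.sgnmt.predictors.core import Predictor  # assumed module path
    from cam.sgnmt import utils  # assumed to provide EOS_ID

    class LengthBonusPredictor(Predictor):
        """Toy predictor that rewards longer hypotheses by assigning a
        constant bonus to every word except the end-of-sentence symbol."""

        def __init__(self, bonus=0.1):
            super(LengthBonusPredictor, self).__init__()
            self.bonus = bonus

        def initialize(self, src_sentence):
            # Called once per source sentence; this toy predictor
            # keeps no per-sentence state.
            pass

        def predict_next(self):
            # Return scores for the next target word. Words not listed
            # here fall back to get_unk_probability(), so EOS gets no
            # bonus while all other words implicitly receive it.
            return {utils.EOS_ID: 0.0}

        def get_unk_probability(self, posterior):
            # Score for any word not explicitly returned by predict_next().
            return self.bonus

        def consume(self, word):
            # Called when a word is appended to the partial hypothesis;
            # nothing to track for this stateless example.
            pass

        def get_state(self):
            return None

        def set_state(self, state):
            pass

Because the decoder only interacts with predictors through these methods, a sketch like this could be combined with, for example, an NMT predictor in a linear combination, which is the kind of complex decoding task the abstract refers to.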
Anthology ID:
D17-2005
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Lucia Specia, Matt Post, Michael Paul
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
25–30
URL:
https://aclanthology.org/D17-2005
DOI:
10.18653/v1/D17-2005
Cite (ACL):
Felix Stahlberg, Eva Hasler, Danielle Saunders, and Bill Byrne. 2017. SGNMT – A Flexible NMT Decoding Platform for Quick Prototyping of New Models and Search Strategies. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 25–30, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
SGNMT – A Flexible NMT Decoding Platform for Quick Prototyping of New Models and Search Strategies (Stahlberg et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-2005.pdf