Neural Finite-State Transducers: Beyond Rational Relations

Chu-Cheng Lin, Hao Zhu, Matthew R. Gormley, Jason Eisner


Abstract
We introduce neural finite-state transducers (NFSTs), a family of string transduction models defining joint and conditional probability distributions over pairs of strings. The probability of a string pair is obtained by marginalizing over all its accepting paths in a finite-state transducer. In contrast to ordinary weighted FSTs, however, each path is scored using an arbitrary function such as a recurrent neural network, which breaks the usual conditional independence assumption (Markov property). NFSTs are more powerful than previous finite-state models with neural features (Rastogi et al., 2016). We present training and inference algorithms for locally and globally normalized variants of NFSTs. In experiments on different transduction tasks, they compete favorably against seq2seq models while offering interpretable paths that correspond to hard monotonic alignments.
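As a rough illustration of the globally normalized variant mentioned in the abstract, the marginal probability of a string pair can be sketched as below; the symbols A(x, y), G_θ, and Z_θ are illustrative notation introduced here, not necessarily the paper's own.

\[
p_\theta(x, y) \;=\; \frac{1}{Z_\theta} \sum_{a \in A(x, y)} \exp G_\theta(a),
\qquad
Z_\theta \;=\; \sum_{a} \exp G_\theta(a),
\]

where A(x, y) is the set of accepting paths of the transducer that map x to y, G_θ is the neural (e.g., RNN) path-scoring function, and the partition function Z_θ sums over all accepting paths. A locally normalized variant would instead factor each path's probability into per-transition softmaxes, avoiding the global partition function.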
Anthology ID:
N19-1024
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
272–283
URL:
https://aclanthology.org/N19-1024
DOI:
10.18653/v1/N19-1024
Cite (ACL):
Chu-Cheng Lin, Hao Zhu, Matthew R. Gormley, and Jason Eisner. 2019. Neural Finite-State Transducers: Beyond Rational Relations. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 272–283, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Neural Finite-State Transducers: Beyond Rational Relations (Lin et al., NAACL 2019)
PDF:
https://aclanthology.org/N19-1024.pdf
Supplementary:
N19-1024.Supplementary.pdf