Finding syntax in human encephalography with beam search

John Hale, Chris Dyer, Adhiguna Kuncoro, Jonathan Brennan


Abstract
Recurrent neural network grammars (RNNGs) are generative models of (tree, string) pairs that rely on neural networks to evaluate derivational choices. Parsing with them using beam search yields a variety of incremental complexity metrics such as word surprisal and parser action count. When used as regressors against human electrophysiological responses to naturalistic text, they derive two amplitude effects: an early peak and a P600-like later peak. By contrast, a non-syntactic neural language model yields no reliable effects. Model comparisons attribute the early peak to syntactic composition within the RNNG. This pattern of results recommends the RNNG+beam search combination as a mechanistic model of the syntactic processing that occurs during normal human language comprehension.
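The abstract mentions word surprisal computed from a word-synchronous beam search over RNNG derivations. The sketch below is an illustrative approximation, not the authors' implementation: it assumes each beam item carries the joint log-probability of a partial (tree, string) analysis, and estimates surprisal as the drop in log prefix probability between consecutive words. All names and the toy numbers are hypothetical.

import math

def log_sum_exp(log_probs):
    """Numerically stable log of a sum of probabilities given in log space."""
    m = max(log_probs)
    return m + math.log(sum(math.exp(lp - m) for lp in log_probs))

def surprisal_bits(prev_beam_logps, curr_beam_logps):
    """Surprisal of the newest word, in bits:
    -log2 P(w_t | w_<t) is approximated as the difference between the
    log prefix probability after w_<t and after w_<=t, each estimated by
    marginalizing (summing) over the derivations surviving in the beam."""
    prev = log_sum_exp(prev_beam_logps)
    curr = log_sum_exp(curr_beam_logps)
    return (prev - curr) / math.log(2)

# Toy usage: beam log-probabilities after "the" and after "the dog".
beam_after_the = [-2.1, -2.8, -3.5]
beam_after_dog = [-4.0, -4.9, -5.6]
print(f"surprisal(dog) ≈ {surprisal_bits(beam_after_the, beam_after_dog):.2f} bits")

Parser action count, the other complexity metric named above, would instead be read off the number of derivational steps (shifts and nonterminal opens/closes) taken between consecutive words on the best or beam-averaged analysis.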
Anthology ID:
P18-1254
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2727–2736
URL:
https://aclanthology.org/P18-1254
DOI:
10.18653/v1/P18-1254
Cite (ACL):
John Hale, Chris Dyer, Adhiguna Kuncoro, and Jonathan Brennan. 2018. Finding syntax in human encephalography with beam search. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2727–2736, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Finding syntax in human encephalography with beam search (Hale et al., ACL 2018)
PDF:
https://aclanthology.org/P18-1254.pdf
Video:
https://aclanthology.org/P18-1254.mp4