Structural Embedding of Syntactic Trees for Machine Comprehension

Rui Liu, Junjie Hu, Wei Wei, Zi Yang, Eric Nyberg


Abstract
Deep neural networks for machine comprehension typically utilize only word or character embeddings without explicitly taking advantage of structured linguistic information such as constituency trees and dependency trees. In this paper, we propose structural embedding of syntactic trees (SEST), an algorithmic framework that encodes structured information into vector representations to boost the performance of machine comprehension algorithms. We evaluate our approach using a state-of-the-art neural attention model on the SQuAD dataset. Experimental results demonstrate that our model can accurately identify the syntactic boundaries of sentences and extract answers that are more syntactically coherent than those of the baseline methods.
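As a rough illustration of the idea described above, one way to encode a token's position in a constituency tree is to embed the sequence of constituent labels on its root-to-leaf path and pool them into a single vector. The sketch below is a minimal, hypothetical illustration of this general technique; the tag set, dimensions, and pooling choice are assumptions for exposition and not the paper's actual SEST implementation.

```python
import numpy as np

# Illustrative-only embedding table for a handful of constituent labels.
# Real systems would learn these jointly with the rest of the model.
rng = np.random.default_rng(0)
TAGS = ["S", "NP", "VP", "NN", "VB", "DT"]
DIM = 8
tag_emb = {t: rng.standard_normal(DIM) for t in TAGS}

def structural_embedding(path):
    """Pool (here: average) the embeddings of the constituent labels
    on a token's root-to-leaf path into one fixed-size vector."""
    vecs = [tag_emb[t] for t in path]
    return np.mean(vecs, axis=0)

# Example: the token "dog" in "The dog barks" sits under S -> NP -> NN.
vec = structural_embedding(["S", "NP", "NN"])
print(vec.shape)  # (8,)
```

Such a structural vector can then be concatenated with the token's word embedding before being fed to a reading-comprehension model, which is the general spirit of using syntactic structure as an additional input signal.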
Anthology ID:
D17-1085
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
815–824
URL:
https://aclanthology.org/D17-1085
DOI:
10.18653/v1/D17-1085
Cite (ACL):
Rui Liu, Junjie Hu, Wei Wei, Zi Yang, and Eric Nyberg. 2017. Structural Embedding of Syntactic Trees for Machine Comprehension. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 815–824, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Structural Embedding of Syntactic Trees for Machine Comprehension (Liu et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1085.pdf
Data
SQuAD