On Tree-Based Neural Sentence Modeling

Haoyue Shi, Hao Zhou, Jiaze Chen, Lei Li


Abstract
Neural networks with tree-based sentence encoders have shown better results on many downstream tasks. Most existing tree-based encoders adopt syntactic parsing trees as the explicit structure prior. To study the effectiveness of different tree structures, we replace the parsing trees with trivial trees (i.e., binary balanced tree, left-branching tree and right-branching tree) in the encoders. Though trivial trees contain no syntactic information, those encoders get competitive or even better results on all of the ten downstream tasks we investigated. This surprising result indicates that explicit syntax guidance may not be the main contributor to the superior performances of tree-based neural sentence modeling. Further analysis shows that tree modeling gives better results when crucial words are closer to the final representation. Additional experiments give more clues on how to design an effective tree-based encoder. Our code is open-source and available at https://github.com/ExplorerFreda/TreeEnc.
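To make the three trivial tree structures mentioned in the abstract concrete, the sketch below (not the paper's code; function names are illustrative) builds a left-branching, a right-branching, and a binary balanced tree over a token sequence, each represented as nested tuples that a tree-based encoder would compose bottom-up:

```python
def left_branching(tokens):
    """Combine from the left: (((w1, w2), w3), ...)."""
    tree = tokens[0]
    for tok in tokens[1:]:
        tree = (tree, tok)
    return tree

def right_branching(tokens):
    """Combine from the right: (w1, (w2, (w3, ...)))."""
    tree = tokens[-1]
    for tok in reversed(tokens[:-1]):
        tree = (tok, tree)
    return tree

def balanced(tokens):
    """Split the span in half recursively to form a binary balanced tree."""
    if len(tokens) == 1:
        return tokens[0]
    mid = (len(tokens) + 1) // 2  # left half takes the extra token on odd lengths
    return (balanced(tokens[:mid]), balanced(tokens[mid:]))

words = ["the", "cat", "sat", "down"]
print(left_branching(words))   # ((('the', 'cat'), 'sat'), 'down')
print(right_branching(words))  # ('the', ('cat', ('sat', 'down')))
print(balanced(words))         # (('the', 'cat'), ('sat', 'down'))
```

Note how the balanced tree halves the distance from leaf words to the root relative to the branching trees, which relates to the paper's observation about crucial words being closer to the final representation.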
Anthology ID:
D18-1492
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4631–4641
URL:
https://aclanthology.org/D18-1492
DOI:
10.18653/v1/D18-1492
Cite (ACL):
Haoyue Shi, Hao Zhou, Jiaze Chen, and Lei Li. 2018. On Tree-Based Neural Sentence Modeling. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4631–4641, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
On Tree-Based Neural Sentence Modeling (Shi et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1492.pdf
Attachment:
D18-1492.Attachment.zip
Code
ExplorerFreda/TreeEnc
Data
AG News
DBpedia
SNLI