Transition-based Neural RST Parsing with Implicit Syntax Features

Nan Yu, Meishan Zhang, Guohong Fu


Abstract
Syntax has been a useful source of information for statistical RST discourse parsing. In the neural setting, a common approach integrates syntax via a recursive neural network (Tree-RNN), which requires discrete output trees produced by a supervised syntax parser. In this paper, we propose an implicit syntax feature extraction approach that instead uses hidden-layer vectors extracted from a neural syntax parser. In addition, we propose a simple transition-based model as the baseline and further enhance it with a dynamic oracle. Experiments on the standard dataset show that our baseline model with the dynamic oracle is highly competitive. When implicit syntax features are integrated, we obtain further improvements, outperforming the explicit Tree-RNN approach.
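As a rough illustration of the "implicit syntax features" idea described in the abstract, the sketch below shows one plausible reading: per-token hidden vectors from a pretrained neural syntax parser are mean-pooled over each EDU's token span and used as dense features for the RST parser, with no discrete syntax trees involved. This is a minimal sketch under stated assumptions, not the paper's exact architecture; the function name, dimensions, and the mean-pooling choice are all illustrative.

```python
import numpy as np

def implicit_syntax_features(hidden_states, edu_spans):
    """Pool a syntax parser's hidden-layer vectors over EDU spans.

    hidden_states: (num_tokens, dim) array of per-token hidden vectors
        from the encoder of a neural syntax parser (assumption: the
        parser exposes these; any BiLSTM-based parser would do).
    edu_spans: list of (start, end) token indices per EDU, end exclusive.

    Returns one vector per EDU. These dense vectors would then be fed
    to the transition-based RST parser as syntax features, in place of
    a Tree-RNN over discrete parser output trees.
    """
    return np.stack([hidden_states[s:e].mean(axis=0) for s, e in edu_spans])

# Toy usage: a 10-token sentence with 8-dim hidden vectors and two EDUs.
states = np.random.randn(10, 8)
feats = implicit_syntax_features(states, [(0, 4), (4, 10)])
print(feats.shape)  # (2, 8): one implicit syntax feature vector per EDU
```

Mean-pooling is just one aggregation choice; the general point is that the parser's internal representations encode syntax softly, so downstream RST parsing need not commit to a single (possibly erroneous) predicted tree.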
Anthology ID: C18-1047
Volume: Proceedings of the 27th International Conference on Computational Linguistics
Month: August
Year: 2018
Address: Santa Fe, New Mexico, USA
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 559–570
URL: https://aclanthology.org/C18-1047
Cite (ACL):
Nan Yu, Meishan Zhang, and Guohong Fu. 2018. Transition-based Neural RST Parsing with Implicit Syntax Features. In Proceedings of the 27th International Conference on Computational Linguistics, pages 559–570, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Transition-based Neural RST Parsing with Implicit Syntax Features (Yu et al., COLING 2018)
PDF: https://aclanthology.org/C18-1047.pdf
Code: fajri91/NeuralRST
Data: RST-DT