The Covert Helps Parse the Overt

Xun Zhang, Weiwei Sun, Xiaojun Wan


Abstract
This paper is concerned with whether deep syntactic information can help surface parsing, with a particular focus on empty categories. We design new algorithms to produce dependency trees in which empty elements are allowed, and evaluate the impact of empty-category information on parsing overt elements. Such information helps reduce the approximation error in a structured parsing model, but it enlarges the search space for inference and accordingly increases the estimation error. To deal with structure-based overfitting, we propose to integrate disambiguation models with and without empty elements, and to perform structure regularization via joint decoding. Experiments on English and Chinese TreeBanks with different parsing models indicate that incorporating empty elements consistently improves surface parsing.
Anthology ID:
K17-1035
Volume:
Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Roger Levy, Lucia Specia
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
343–353
URL:
https://aclanthology.org/K17-1035
DOI:
10.18653/v1/K17-1035
Cite (ACL):
Xun Zhang, Weiwei Sun, and Xiaojun Wan. 2017. The Covert Helps Parse the Overt. In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), pages 343–353, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
The Covert Helps Parse the Overt (Zhang et al., CoNLL 2017)
PDF:
https://aclanthology.org/K17-1035.pdf