Using Context Events in Neural Network Models for Event Temporal Status Identification

Zeyu Dai, Wenlin Yao, Ruihong Huang


Abstract
Focusing on the task of identifying event temporal status, we find that events directly or indirectly governing the target event in a dependency tree are the most important contexts. Therefore, we extract dependency chains containing context events and use them as input in neural network models, which consistently outperform previous models that use local context words as input. Visualization verifies that the dependency chain representation can effectively capture the context events that are closely related to the target event and play key roles in predicting event temporal status.
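The core idea of the abstract — collecting the events that directly or indirectly govern a target event in a dependency tree — can be sketched as a walk from the target token up the head links to the root. The following is an illustrative sketch, not the authors' implementation; the sentence, parse, and helper name are hypothetical.

```python
# Illustrative sketch (not the paper's code): represent a dependency parse
# as head indices and walk from the target event token up to the root,
# collecting the chain of governing tokens that serve as context events.

def dependency_chain(tokens, heads, target):
    """Return token indices on the chain from `target` up to the root.

    tokens: list of words
    heads:  heads[i] is the index of token i's head (-1 marks the root)
    target: index of the target event token
    """
    chain = [target]
    i = target
    seen = {target}
    while heads[i] != -1 and heads[i] not in seen:  # guard against cycles
        i = heads[i]
        chain.append(i)
        seen.add(i)
    return chain

# Hypothetical sentence and parse:
# "said" is the root; "continued" attaches to "said"; "searching" to "continued".
tokens = ["Officials", "said", "rescuers", "continued", "searching", "survivors"]
heads  = [1, -1, 3, 1, 3, 4]

# For the target event "searching" (index 4), the chain climbs through
# "continued" and "said" -- the events that directly or indirectly govern it.
chain_words = [tokens[i] for i in dependency_chain(tokens, heads, 4)]
print(chain_words)  # ['searching', 'continued', 'said']
```

In practice the head indices would come from a dependency parser rather than being written by hand, and the resulting chain of words (or their embeddings) would be fed to the neural model in place of a flat window of local context words.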
Anthology ID:
I17-2040
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Editors:
Greg Kondrak, Taro Watanabe
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
234–239
URL:
https://aclanthology.org/I17-2040
Cite (ACL):
Zeyu Dai, Wenlin Yao, and Ruihong Huang. 2017. Using Context Events in Neural Network Models for Event Temporal Status Identification. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 234–239, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
Using Context Events in Neural Network Models for Event Temporal Status Identification (Dai et al., IJCNLP 2017)
PDF:
https://aclanthology.org/I17-2040.pdf