YNU Deep at SemEval-2018 Task 12: A BiLSTM Model with Neural Attention for Argument Reasoning Comprehension

Peng Ding, Xiaobing Zhou


Abstract
This paper describes the system submitted to SemEval-2018 Task 12 (the Argument Reasoning Comprehension Task). Enabling a computer to understand a text well enough to answer comprehension questions remains a challenging goal in NLP. We propose a Bidirectional LSTM (BiLSTM) model that reads two sentences separated by a delimiter to determine which warrant is correct. We extend this model with a neural attention mechanism that encourages it to reason over the given claims and reasons. Officially released results show that our system ranks 6th among 22 submissions to this task.
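The abstract's architecture can be sketched in a few lines: a shared BiLSTM encodes the delimiter-joined input, an attention layer pools the hidden states, and a scorer picks between the two candidate warrants. This is a minimal illustrative sketch, not the authors' released code; the layer sizes, the `BiLSTMAttentionScorer` class, and the additive-attention pooling are all assumptions for exposition.

```python
import torch
import torch.nn as nn

class BiLSTMAttentionScorer(nn.Module):
    """Hypothetical sketch: encode 'claim [DELIM] reason [DELIM] warrant'
    as one token sequence with a BiLSTM, pool with attention, emit a score.
    Sizes and naming are illustrative assumptions, not the paper's values."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Attention weights over the BiLSTM hidden states.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.score = nn.Linear(2 * hidden_dim, 1)

    def forward(self, tokens):                     # tokens: (batch, seq_len)
        h, _ = self.bilstm(self.embed(tokens))     # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)
        pooled = (weights * h).sum(dim=1)          # attention-weighted sum
        return self.score(pooled).squeeze(-1)      # one score per sequence


def choose_warrant(model, seq_w0, seq_w1):
    # The predicted warrant is the higher-scoring sequence; training would
    # typically apply cross-entropy over the softmax of the two scores.
    scores = model(torch.stack([seq_w0, seq_w1]))
    return int(torch.argmax(scores).item())
```

In use, each candidate warrant is concatenated (with delimiters) to the claim and reason, both sequences are scored by the same model, and the argmax gives the answer label.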
Anthology ID:
S18-1189
Volume:
Proceedings of the 12th International Workshop on Semantic Evaluation
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marianna Apidianaki, Saif M. Mohammad, Jonathan May, Ekaterina Shutova, Steven Bethard, Marine Carpuat
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1120–1123
URL:
https://aclanthology.org/S18-1189
DOI:
10.18653/v1/S18-1189
Cite (ACL):
Peng Ding and Xiaobing Zhou. 2018. YNU Deep at SemEval-2018 Task 12: A BiLSTM Model with Neural Attention for Argument Reasoning Comprehension. In Proceedings of the 12th International Workshop on Semantic Evaluation, pages 1120–1123, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
YNU Deep at SemEval-2018 Task 12: A BiLSTM Model with Neural Attention for Argument Reasoning Comprehension (Ding & Zhou, SemEval 2018)
PDF:
https://aclanthology.org/S18-1189.pdf