Linguistically-Based Deep Unstructured Question Answering

Ahmad Aghaebrahimian


Abstract
In this paper, we propose a new linguistically-based approach to answering non-factoid open-domain questions from unstructured data. First, we elaborate on an architecture for textual encoding, on which we build a deep end-to-end neural model. This architecture benefits from a bilateral attention mechanism that helps the model focus on the question and the answer sentence simultaneously for phrasal answer extraction. Second, we feed the output of a constituency parser directly into the model and integrate linguistic constituents into the network, helping it concentrate on chunks of an answer rather than on its individual words and thereby generate more natural output. By optimizing this architecture, we obtain near-human-performance results on SQuAD and results competitive with a state-of-the-art system on MS-MARCO.
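For readers unfamiliar with bilateral attention, the sketch below illustrates the general idea in PyTorch: every token of the answer sentence attends over the question and every question token attends over the answer sentence, so both sides are summarized with respect to each other. This is only an illustrative approximation, not the paper's implementation; the tensor shapes, the plain dot-product similarity, and the function name bilateral_attention are assumptions made here for clarity.

```python
import torch
import torch.nn.functional as F


def bilateral_attention(context, question):
    """Toy bilateral (two-way) attention between an answer sentence and a question.

    context:  (batch, n, d) encoded answer-sentence tokens
    question: (batch, m, d) encoded question tokens
    Returns context-to-question and question-to-context summaries.
    """
    # Similarity of every context/question token pair: (batch, n, m).
    # The paper may use a richer similarity function; a dot product is assumed here.
    sim = torch.bmm(context, question.transpose(1, 2))

    # Context-to-question: each answer-sentence token attends over the question.
    c2q = torch.bmm(F.softmax(sim, dim=-1), question)                 # (batch, n, d)

    # Question-to-context: each question token attends over the answer sentence.
    q2c = torch.bmm(F.softmax(sim.transpose(1, 2), dim=-1), context)  # (batch, m, d)

    return c2q, q2c


if __name__ == "__main__":
    ctx = torch.randn(2, 30, 64)   # toy answer-sentence encodings
    qst = torch.randn(2, 10, 64)   # toy question encodings
    c2q, q2c = bilateral_attention(ctx, qst)
    print(c2q.shape, q2c.shape)    # torch.Size([2, 30, 64]) torch.Size([2, 10, 64])
```

In the paper's setting, the attended representations would then be combined with constituency-parse information so that answer extraction operates over constituent chunks rather than isolated tokens; that integration step is not shown here.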
Anthology ID:
K18-1042
Volume:
Proceedings of the 22nd Conference on Computational Natural Language Learning
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Anna Korhonen, Ivan Titov
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
433–443
URL:
https://aclanthology.org/K18-1042
DOI:
10.18653/v1/K18-1042
Cite (ACL):
Ahmad Aghaebrahimian. 2018. Linguistically-Based Deep Unstructured Question Answering. In Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 433–443, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Linguistically-Based Deep Unstructured Question Answering (Aghaebrahimian, CoNLL 2018)
PDF:
https://aclanthology.org/K18-1042.pdf
Data:
MS MARCO, SQuAD