Multitask Parsing Across Semantic Representations

Daniel Hershcovich, Omri Abend, Ari Rappoport
Abstract
The ability to consolidate information of different types is at the core of intelligence, and has tremendous practical value in allowing learning for one task to benefit from generalizations learned for others. In this paper we tackle the challenging task of improving semantic parsing performance, taking UCCA parsing as a test case, and AMR, SDP and Universal Dependencies (UD) parsing as auxiliary tasks. We experiment on three languages, using a uniform transition-based system and learning architecture for all parsing tasks. Despite notable conceptual, formal and domain differences, we show that multitask learning significantly improves UCCA parsing in both in-domain and out-of-domain settings.
Anthology ID:
P18-1035
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
373–385
URL:
https://aclanthology.org/P18-1035
DOI:
10.18653/v1/P18-1035
Cite (ACL):
Daniel Hershcovich, Omri Abend, and Ari Rappoport. 2018. Multitask Parsing Across Semantic Representations. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 373–385, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Multitask Parsing Across Semantic Representations (Hershcovich et al., ACL 2018)
PDF:
https://aclanthology.org/P18-1035.pdf
Note:
 P18-1035.Notes.pdf
Software:
 P18-1035.Software.zip
Poster:
 P18-1035.Poster.pdf
Code:
 danielhers/tupa
Data:
 Universal Dependencies