A Sequence-to-Sequence Model for Semantic Role Labeling

Angel Daza, Anette Frank


Abstract
We explore a novel approach to Semantic Role Labeling (SRL) by casting it as a sequence-to-sequence process. We employ an attention-based model enriched with a copying mechanism to ensure faithful regeneration of the input sequence, while enabling interleaved generation of argument role labels. We apply this model in a monolingual setting, performing PropBank SRL on English language data. The constrained sequence generation set-up enforced with the copying mechanism allows us to analyze the performance and special properties of the model on manually labeled data and to benchmark it against state-of-the-art sequence labeling models. We show that our model is able to solve the SRL argument labeling task on English data, yet further structural decoding constraints will need to be added to make the model truly competitive. Our work represents a first step towards more advanced, generative SRL labeling setups.
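The copying mechanism mentioned in the abstract can be illustrated with a pointer-generator style mixture, in which the decoder at each step either generates a token (e.g. a role label) from the output vocabulary or copies a token from the input via its attention weights. This is a hedged sketch of the general technique, not the paper's exact formulation; all names and the toy numbers below are illustrative assumptions.

```python
# Illustrative sketch of a copy-enhanced decoding step (pointer-generator
# style mixture). Not the authors' exact model: function names, the label
# inventory, and all probabilities here are made up for demonstration.

def copy_generate_step(p_gen, vocab_probs, attn_weights, src_tokens, vocab):
    """Mix the generation and copy distributions into one output distribution.

    p_gen        -- scalar in [0, 1]: probability of generating from the vocab
    vocab_probs  -- dict token -> probability over the output (label) vocabulary
    attn_weights -- attention weights, one per source position (sum to 1)
    src_tokens   -- source-side tokens aligned with attn_weights
    vocab        -- all tokens that may appear in the output
    """
    # Generation path: scale the vocabulary distribution by p_gen.
    mixed = {tok: p_gen * vocab_probs.get(tok, 0.0) for tok in vocab}
    # Copy path: route the remaining (1 - p_gen) mass to attended source tokens.
    for w, tok in zip(attn_weights, src_tokens):
        mixed[tok] = mixed.get(tok, 0.0) + (1.0 - p_gen) * w
    return mixed

# Toy step: the decoder attends strongly to "gave"; with a low p_gen the copy
# path dominates, so the input word is faithfully reproduced rather than a
# label being emitted.
src = ["John", "gave", "Mary", "a", "book"]
attn = [0.05, 0.8, 0.05, 0.05, 0.05]
label_probs = {"(A0": 0.1, "A0)": 0.1, "(V": 0.7, "V)": 0.1}
dist = copy_generate_step(0.2, label_probs, attn, src, set(src) | set(label_probs))
best = max(dist, key=dist.get)  # "gave" is copied from the source
```

Interleaved label generation then amounts to the model raising p_gen exactly where a role bracket should open or close, and lowering it elsewhere so the input sequence is regenerated verbatim.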
Anthology ID:
W18-3027
Volume:
Proceedings of the Third Workshop on Representation Learning for NLP
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Isabelle Augenstein, Kris Cao, He He, Felix Hill, Spandana Gella, Jamie Kiros, Hongyuan Mei, Dipendra Misra
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
207–216
URL:
https://aclanthology.org/W18-3027
DOI:
10.18653/v1/W18-3027
Cite (ACL):
Angel Daza and Anette Frank. 2018. A Sequence-to-Sequence Model for Semantic Role Labeling. In Proceedings of the Third Workshop on Representation Learning for NLP, pages 207–216, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Sequence-to-Sequence Model for Semantic Role Labeling (Daza & Frank, RepL4NLP 2018)
PDF:
https://aclanthology.org/W18-3027.pdf