Crowdsourcing discourse interpretations: On the influence of context and the reliability of a connective insertion task

Merel Scholman, Vera Demberg


Abstract
Traditional discourse annotation tasks are considered costly and time-consuming, and the reliability and validity of these tasks are in question. In this paper, we investigate whether crowdsourcing can be used to obtain reliable discourse relation annotations. We also examine the influence of context on the reliability of the data. The results of a crowdsourced connective insertion task showed that the method can be used to obtain reliable annotations: the majority of the inserted connectives converged with the original label. Further, the method is sensitive to the fact that multiple senses can often be inferred for a single relation. Regarding the presence of context, the results showed no significant difference in the distributions of insertions between conditions overall. However, a by-item comparison revealed several characteristics of segments that determine whether the presence of context makes a difference in annotations. These findings can be taken as evidence that crowdsourcing is a valuable method for obtaining insights into the sense(s) of relations.
Anthology ID:
W17-0803
Volume:
Proceedings of the 11th Linguistic Annotation Workshop
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Nathan Schneider, Nianwen Xue
Venue:
LAW
SIG:
SIGANN
Publisher:
Association for Computational Linguistics
Pages:
24–33
URL:
https://aclanthology.org/W17-0803
DOI:
10.18653/v1/W17-0803
Cite (ACL):
Merel Scholman and Vera Demberg. 2017. Crowdsourcing discourse interpretations: On the influence of context and the reliability of a connective insertion task. In Proceedings of the 11th Linguistic Annotation Workshop, pages 24–33, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Crowdsourcing discourse interpretations: On the influence of context and the reliability of a connective insertion task (Scholman & Demberg, LAW 2017)
PDF:
https://aclanthology.org/W17-0803.pdf