BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization

Eva Sharma, Chen Li, Lu Wang


Abstract
Most existing text summarization datasets are compiled from the news domain, where summaries have a flattened discourse structure. In such datasets, summary-worthy content often appears near the beginning of input articles. Moreover, large segments from input articles are present verbatim in their respective summaries. These issues impede the learning and evaluation of systems that can understand an article’s global content structure as well as produce abstractive summaries with a high compression ratio. In this work, we present a novel dataset, BIGPATENT, consisting of 1.3 million records of U.S. patent documents along with human-written abstractive summaries. Compared to existing summarization datasets, BIGPATENT has the following properties: i) summaries contain a richer discourse structure with more recurring entities, ii) salient content is evenly distributed in the input, and iii) fewer and shorter extractive fragments are present in the summaries. Finally, we train and evaluate baselines and popular learning models on BIGPATENT to shed light on new challenges and motivate future directions for summarization research.
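Property (iii) concerns extractiveness statistics. The sketch below is an illustrative Python reconstruction, not code released with the paper, of the extractive fragment coverage and density measures of Grusky et al. (2018), which quantify how much of a summary is copied verbatim from its source.

def extractive_fragments(article, summary):
    """Greedily find maximal summary spans that appear verbatim in the article.

    `article` and `summary` are token lists; returns a list of fragments,
    each itself a list of tokens."""
    fragments, i = [], 0
    while i < len(summary):
        best = 0
        # longest article span matching the summary starting at position i
        for j in range(len(article)):
            k = 0
            while (i + k < len(summary) and j + k < len(article)
                   and summary[i + k] == article[j + k]):
                k += 1
            best = max(best, k)
        if best > 0:
            fragments.append(summary[i:i + best])
            i += best
        else:
            i += 1  # this summary token never occurs in the article
    return fragments

def coverage_and_density(article, summary):
    frags = extractive_fragments(article, summary)
    n = len(summary)
    coverage = sum(len(f) for f in frags) / n       # fraction of copied tokens
    density = sum(len(f) ** 2 for f in frags) / n   # rewards long copied spans
    return coverage, density

# Example: a fully copied summary has coverage 1.0; abstractive summaries,
# such as those in BIGPATENT, score lower on both measures.
src = "a method for summarizing patent documents is disclosed".split()
summ = "summarizing patent documents with a neural method".split()
print(coverage_and_density(src, summ))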
Anthology ID: P19-1212
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 2204–2213
URL: https://aclanthology.org/P19-1212
DOI: 10.18653/v1/P19-1212
Cite (ACL): Eva Sharma, Chen Li, and Lu Wang. 2019. BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2204–2213, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): BIGPATENT: A Large-Scale Dataset for Abstractive and Coherent Summarization (Sharma et al., ACL 2019)
PDF: https://aclanthology.org/P19-1212.pdf
Data: BigPatent, NEWSROOM, New York Times Annotated Corpus
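For readers who want to work with the corpus, a minimal loading sketch follows. It assumes the dataset is mirrored on the Hugging Face Hub as big_patent with "description" (input document) and "abstract" (summary) fields; the dataset name, configuration, and field names are assumptions, so verify them against the dataset card.

from datasets import load_dataset

# "all" is assumed to combine the nine CPC categories into one split;
# streaming avoids downloading the full ~1.3M-record corpus up front.
ds = load_dataset("big_patent", "all", split="train", streaming=True)
example = next(iter(ds))
print(example["abstract"][:200])      # human-written summary
print(example["description"][:200])   # full patent description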