SOTAVerified

Concept Pointer Network for Abstractive Summarization

2019-10-18 · IJCNLP 2019 · Code Available

Wang Wenbo, Gao Yang, Huang Heyan, Zhou Yuxiang


Abstract

A quality abstractive summary should not only copy salient source text but also generate new conceptual words that express concrete details. Inspired by the popular pointer-generator sequence-to-sequence model, this paper presents a concept pointer network for improving these aspects of abstractive summarization. The network leverages knowledge-based, context-aware conceptualizations to derive an extended set of candidate concepts. The model then points to the most appropriate choice using both the concept set and the original source text. This joint approach generates abstractive summaries with higher-level semantic concepts. The training model is also optimized to adapt to different data, based on a novel method of distantly-supervised learning guided by reference summaries and the test set. Overall, the proposed approach provides statistically significant improvements over several state-of-the-art models on both the DUC-2004 and Gigaword datasets. A human evaluation of the model's abstractive abilities also supports the quality of the summaries produced within this framework.
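The abstract describes a pointer-style decoder that distributes probability mass over the vocabulary, the source tokens, and an extended set of candidate concepts for each source token. The following is a minimal sketch of one such decoding step; the function name, the equal split of copy mass between a source token and its concepts, and all shapes are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def concept_pointer_step(p_vocab, attn_scores, src_ids, concept_ids, p_gen):
    """One decoding step of a concept-pointer-style output distribution.

    Combines two sources of probability mass:
      - generation from the fixed vocabulary (weight p_gen),
      - copying a source token or pointing to one of its candidate
        concepts (weight 1 - p_gen), routed by the attention weights.

    p_vocab      : generation distribution over the vocabulary
    attn_scores  : unnormalized attention logits over source positions
    src_ids      : vocabulary id of the token at each source position
    concept_ids  : list of candidate-concept ids per source position
    p_gen        : scalar in [0, 1] trading off generation vs. pointing

    NOTE: splitting each position's copy mass equally between the source
    token and its concepts is a simplification for illustration.
    """
    attn = softmax(np.asarray(attn_scores, dtype=float))
    p_final = p_gen * np.asarray(p_vocab, dtype=float).copy()
    for pos, a in enumerate(attn):
        targets = [src_ids[pos]] + list(concept_ids[pos])
        for t in targets:
            p_final[t] += (1.0 - p_gen) * a / len(targets)
    return p_final
```

For example, with a uniform 6-word vocabulary distribution, two source positions pointing at ids 2 and 3 with single concept candidates 4 and 5, and `p_gen=0.6`, the concept ids receive extra mass from the pointer branch while the result remains a valid probability distribution.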

Benchmark Results

| Dataset  | Model              | Metric  | Claimed | Verified | Status     |
|----------|--------------------|---------|---------|----------|------------|
| Gigaword | Concept pointer+RL | ROUGE-1 | 38.02   |          | Unverified |
| Gigaword | Concept pointer+DS | ROUGE-1 | 37.01   |          | Unverified |

Reproductions