Sentence-level Planning for Especially Abstractive Summarization

2021-11-01 · EMNLP 2021 (newsum workshop) · Code Available

Andreas Marfurt, James Henderson

Abstract

Abstractive summarization models heavily rely on copy mechanisms, such as the pointer network or attention, to achieve good performance, measured by textual overlap with reference summaries. As a result, the generated summaries stay close to the formulations in the source document. We propose the *sentence planner* model to generate more abstractive summaries. It includes a hierarchical decoder that first generates a representation for the next summary sentence, and then conditions the word generator on this representation. Our generated summaries are more abstractive and at the same time achieve high ROUGE scores when compared to human reference summaries. We verify the effectiveness of our design decisions with extensive evaluations.
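The abstract describes a two-level decoding loop: a sentence-level planner first produces a representation of the next summary sentence, and a word-level generator is then conditioned on that representation. The toy sketch below illustrates this control flow only; the parameter matrices, function names, and greedy decoding over a toy vocabulary are all hypothetical stand-ins, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8   # size of a sentence representation (toy)
V = 20  # vocabulary size (toy)

# Hypothetical toy parameters standing in for the two decoder levels.
W_plan = rng.standard_normal((D, D)) * 0.1  # sentence-level planner
W_word = rng.standard_normal((V, D)) * 0.1  # word-level generator

def plan_next_sentence(prev_sent_reprs):
    """Sentence planner: produce a representation for the next summary
    sentence from the representations of sentences planned so far."""
    context = np.mean(prev_sent_reprs, axis=0)
    return np.tanh(W_plan @ context)

def generate_words(sent_repr, length=5):
    """Word generator conditioned on the planned sentence representation
    (greedy decoding over a toy vocabulary)."""
    words = []
    state = sent_repr
    for _ in range(length):
        logits = W_word @ state
        words.append(int(np.argmax(logits)))
        # Feed a simple function of the chosen word back in (toy recurrence).
        state = np.tanh(state + W_word[words[-1]])
    return words

# Decode a two-sentence summary: plan a sentence, generate its words, repeat.
sent_reprs = [np.zeros(D)]  # representation of a start-of-summary token
summary = []
for _ in range(2):
    r = plan_next_sentence(sent_reprs)
    summary.append(generate_words(r))
    sent_reprs.append(r)
```

The key property the paper exploits is visible in the loop: word generation never attends to the source text directly here, only to the planned sentence representation, which is what pushes the output away from copying source formulations.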
