Neural Extractive Text Summarization with Syntactic Compression

2019-02-03 · IJCNLP 2019 · Code Available

Jiacheng Xu, Greg Durrett

Abstract

Recent neural network approaches to summarization are largely either selection-based extraction or generation-based abstraction. In this work, we present a neural model for single-document summarization based on joint extraction and syntactic compression. Our model chooses sentences from the document, identifies possible compressions based on constituency parses, and scores those compressions with a neural model to produce the final summary. For learning, we construct oracle extractive-compressive summaries, then learn both of our components jointly with this supervision. Experimental results on the CNN/Daily Mail and New York Times datasets show that our model achieves strong performance (comparable to state-of-the-art systems) as evaluated by ROUGE. Moreover, our approach outperforms an off-the-shelf compression module, and human and manual evaluation shows that our model's output generally remains grammatical.
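To make the compression step concrete, here is a minimal sketch, not the authors' code, of how compression candidates might be enumerated from a constituency parse by deleting individual constituents. The deletable label set (PP, SBAR, ADVP) and the helper compression_candidates are illustrative assumptions; the paper's actual rule set, and the neural scorer that ranks candidates jointly with sentence selection, are more involved.

```python
# A minimal sketch, not the authors' code: enumerate compression candidates
# for one sentence by deleting single constituents from its constituency parse.
# The deletable label set below is an assumed stand-in for the paper's rules;
# the full model also scores each candidate with a neural component.
from nltk.tree import Tree

DELETABLE = {"PP", "SBAR", "ADVP"}  # illustrative subset of compressible labels

def compression_candidates(parse: Tree):
    """Yield (deleted_phrase, compressed_sentence) pairs, one per deletable node."""
    words = parse.leaves()
    leaf_positions = parse.treepositions("leaves")
    for pos in parse.treepositions():
        node = parse[pos]
        if isinstance(node, Tree) and node.label() in DELETABLE:
            # Leaves covered by this subtree are those whose tree position
            # starts with the subtree's position.
            covered = {i for i, lp in enumerate(leaf_positions) if lp[: len(pos)] == pos}
            kept = [w for i, w in enumerate(words) if i not in covered]
            yield " ".join(node.leaves()), " ".join(kept)

if __name__ == "__main__":
    # Toy parse of "The cat which was hungry sat on the mat".
    parse = Tree.fromstring(
        "(S (NP (NP (DT The) (NN cat)) (SBAR (WHNP (WDT which)) "
        "(S (VP (VBD was) (ADJP (JJ hungry)))))) "
        "(VP (VBD sat) (PP (IN on) (NP (DT the) (NN mat)))))"
    )
    for deleted, kept in compression_candidates(parse):
        print(f"delete [{deleted}] -> {kept}")
```

On the toy parse this yields two candidates: dropping the SBAR ("which was hungry") and dropping the PP ("on the mat"). In the paper's model, such candidates would then be scored by the neural component learned jointly with the sentence extractor.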
