A Neural Attention Model for Abstractive Sentence Summarization
2015-09-02 · EMNLP 2015
Alexander M. Rush, Sumit Chopra, Jason Weston
Abstract
Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
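The core idea of an attention-based generator is that each output word is conditioned on a soft, weighted view of the input sentence. The following minimal sketch (hypothetical function names; the paper's actual model additionally uses learned embeddings and a neural language model trained end-to-end) shows how attention weights and a context vector are computed:

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_context(query, keys, values):
    # Score each input position against the decoder state ("query")
    # by dot product, normalize with softmax, and return the
    # attention weights plus the weighted sum of input values.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[d] for w, v in zip(weights, values))
               for d in range(dim)]
    return weights, context
```

At each decoding step, the context vector would be combined with the previously generated words to score candidate summary words; the weights shift ("attend") across the input as the summary is generated.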
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| DUC 2004 Task 1 | Abs | ROUGE-1 | 26.55 | — | Unverified |