A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
2018-04-16 · NAACL 2018 · Code Available
Arman Cohan, Franck Dernoncourt, Doo Soon Kim, Trung Bui, Seokhwan Kim, Walter Chang, Nazli Goharian
Code
- github.com/acohan/long-summarization (official, referenced in paper, TensorFlow) ★ 0
- github.com/AlexGidiotis/DANCER-summ (PyTorch) ★ 18
- github.com/tongbao96/code-for-sfr-as (PyTorch) ★ 8
Abstract
Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
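The abstract describes a hierarchical encoder (word-level states within each section, section-level states over the discourse structure) and a decoder whose word-level attention is reweighted by attention over sections. The sketch below illustrates that discourse-aware attention step. It is a minimal, simplified rendering under assumed names and shapes (the additive scoring layers, `section_ids`, and matching encoder/decoder dimensions are all assumptions), not the authors' released TensorFlow implementation.

```python
# Minimal sketch of discourse-aware attention (assumed shapes and names;
# a simplified illustration, not the paper's released implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DiscourseAwareAttention(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # Score a (encoder state, decoder state) pair at the word and section level.
        self.word_score = nn.Linear(2 * hidden_dim, 1)
        self.section_score = nn.Linear(2 * hidden_dim, 1)

    def forward(self, word_states, section_states, section_ids, dec_state):
        """
        word_states:    (num_words, hidden_dim)    word-level encoder states
        section_states: (num_sections, hidden_dim) section-level encoder states
        section_ids:    (num_words,) long tensor, section index of each word
        dec_state:      (hidden_dim,)              current decoder state
        """
        # Section-level attention (beta) over discourse sections.
        sec_in = torch.cat(
            [section_states, dec_state.expand_as(section_states)], dim=-1)
        beta = F.softmax(self.section_score(sec_in).squeeze(-1), dim=0)

        # Word-level scores, reweighted by the attention of each word's section,
        # then renormalized over all words in the document.
        word_in = torch.cat(
            [word_states, dec_state.expand_as(word_states)], dim=-1)
        word_logits = self.word_score(word_in).squeeze(-1)
        alpha = torch.exp(word_logits - word_logits.max()) * beta[section_ids]
        alpha = alpha / alpha.sum()

        # Context vector consumed by the decoder at this timestep.
        context = alpha @ word_states
        return context, alpha
```

The context vector would then feed the decoder's next-token prediction; the additive scoring and the exact renormalization here stand in for whatever the official code uses, so treat the block as a conceptual illustration of section-then-word attention rather than a drop-in module.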
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| arXiv (scientific papers) | Discourse | ROUGE-1 | 35.8 | — | Unverified |
| PubMed | Discourse | ROUGE-1 | 38.93 | — | Unverified |