A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents

2018-04-16 · NAACL 2018 · Code Available

Arman Cohan, Franck Dernoncourt, Doo Soon Kim, Trung Bui, Seokhwan Kim, Walter Chang, Nazli Goharian

Abstract

Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
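The abstract only sketches the architecture, so the following is a minimal, hypothetical PyTorch sketch of the core idea as described there: a word-level RNN encodes each discourse section, a section-level RNN combines section vectors, and the decoder's word attention is re-weighted by attention over sections. All names, dimensions, the single-layer GRUs, and the mean-pooling of section vectors are assumptions for illustration, not the authors' implementation; their released code should be consulted for the actual model.

```python
# Illustrative sketch only (assumed shapes and components, not the authors' code).
import torch
import torch.nn as nn


class DiscourseAwareEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word-level encoder runs over the tokens of one discourse section at a time.
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Section-level encoder runs over one vector per section.
        self.sect_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, sections):
        # sections: list of LongTensors, one (1, words_i) tensor per discourse section.
        word_states, sect_vecs = [], []
        for sec in sections:
            h, _ = self.word_rnn(self.embed(sec))   # (1, words_i, 2*hid)
            word_states.append(h)
            sect_vecs.append(h.mean(dim=1))         # mean pooling as a stand-in summary
        sect_states, _ = self.sect_rnn(torch.stack(sect_vecs, dim=1))
        return word_states, sect_states             # per-word and per-section states


def discourse_aware_attention(dec_state, word_states, sect_states, w_word, w_sect):
    # Section-level scores modulate word-level scores, so words in relevant
    # sections receive more attention mass (the "discourse-aware" part).
    # dec_state: (1, 2*hid); w_word, w_sect: (2*hid, 2*hid) bilinear weights.
    sect_scores = torch.softmax(
        (sect_states @ w_sect @ dec_state.unsqueeze(-1)).squeeze(-1), dim=-1)   # (1, n_sect)
    scored = []
    for i, h in enumerate(word_states):
        word_scores = (h @ w_word @ dec_state.unsqueeze(-1)).squeeze(-1)        # (1, words_i)
        scored.append(word_scores + sect_scores[:, i:i + 1].log())              # gate by section
    attn = torch.softmax(torch.cat(scored, dim=-1), dim=-1)                     # over all words
    context = attn.unsqueeze(1) @ torch.cat(word_states, dim=1)                 # (1, 1, 2*hid)
    return context.squeeze(1), attn


if __name__ == "__main__":
    torch.manual_seed(0)
    enc = DiscourseAwareEncoder(vocab_size=1000)
    secs = [torch.randint(0, 1000, (1, n)) for n in (12, 7, 20)]  # three toy sections
    words, sects = enc(secs)
    dec_state = torch.zeros(1, 512)
    w = torch.randn(512, 512) * 0.01
    ctx, attn = discourse_aware_attention(dec_state, words, sects, w, w)
    print(ctx.shape, attn.shape)  # torch.Size([1, 512]) torch.Size([1, 39])
```

In this sketch the decoder would consume the returned context vector at each step; the pointer-generator machinery the paper builds on is omitted for brevity.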

Tasks

Benchmark Results

Dataset                     | Model     | Metric  | Claimed | Verified | Status
Arxiv HEP-TH citation graph | Discourse | ROUGE-1 | 35.8    | –        | Unverified
Pubmed                      | Discourse | ROUGE-1 | 38.93   | –        | Unverified

Reproductions