
CATS: Customizable Abstractive Topic-based Summarization

2019-05-24

Anonymous


Abstract

Neural sequence-to-sequence models are a recently proposed family of approaches for abstractive summarization of text documents; they produce condensed versions of source texts without being restricted to words from the original text. Despite advances in abstractive summarization, customized generation of summaries (e.g., tailored to a user's preferences) remains unexplored. In this paper, we present CATS, an abstractive neural summarization model that summarizes content in a sequence-to-sequence fashion while introducing a new mechanism to control the underlying latent topic distribution of the produced summaries. Our experimental results on the well-known CNN/DailyMail dataset show that our model achieves state-of-the-art performance.
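The control idea described in the abstract can be illustrated with a minimal sketch: biasing a decoder's next-token distribution toward a user-chosen topic's word distribution. All names here (`topic_biased_distribution`, `mix`) are illustrative assumptions, not the paper's actual CATS mechanism.

```python
def topic_biased_distribution(decoder_probs, topic_word_probs, mix=0.3):
    """Mix the decoder's token distribution with a topic's word
    distribution and renormalize, steering generation toward the topic.

    This is a conceptual toy, not the CATS architecture: `mix` is a
    hypothetical user-controlled knob for topic influence.
    """
    combined = [(1.0 - mix) * d + mix * t
                for d, t in zip(decoder_probs, topic_word_probs)]
    total = sum(combined)
    return [c / total for c in combined]

# Toy vocabulary of 4 tokens: a decoder distribution and a "sports" topic.
decoder_probs = [0.4, 0.3, 0.2, 0.1]
sports_topic = [0.05, 0.05, 0.1, 0.8]

biased = topic_biased_distribution(decoder_probs, sports_topic, mix=0.5)
print(biased)  # token 3's probability rises from 0.1 toward the topic's 0.8
```

Raising `mix` pushes the summary's vocabulary further toward the chosen topic; setting it to 0 recovers the unconditioned decoder.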
