Global Optimization under Length Constraint for Neural Text Summarization

2019-07-01 · ACL 2019

Takuya Makino, Tomoya Iwakura, Hiroya Takamura, Manabu Okumura


Abstract

We propose a global optimization method under length constraint (GOLC) for neural text summarization models. GOLC increases the probabilities of generating summaries that have high evaluation scores, ROUGE in this paper, within a desired length. We compared GOLC with two optimization methods, a maximum log-likelihood and a minimum risk training, on CNN/Daily Mail and a Japanese single document summarization data set of The Mainichi Shimbun Newspapers. The experimental results show that a state-of-the-art neural summarization model optimized with GOLC generates fewer overlength summaries while maintaining the fastest processing speed: only 6.70% overlength summaries on CNN/Daily Mail and 7.8% on long summaries of Mainichi, compared to approximately 20% to 50% on CNN/Daily Mail and 10% to 30% on Mainichi with the other optimization methods. We also demonstrate the importance of generating in-length summaries for post-editing with the Mainichi dataset, which is created under strict length constraints. The experimental results show approximately 30% to 40% improved post-editing time by use of in-length summaries.
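To make the idea of "high evaluation scores within a desired length" concrete, here is a minimal sketch of a length-constrained reward combined with a minimum-risk-style expected loss over sampled summaries. This is an illustration under assumptions, not the paper's exact GOLC formulation: it uses a unigram-F1 stand-in for ROUGE, a simple truncate-at-budget rule, and toy sample probabilities; all function names and values are hypothetical.

```python
# Illustrative sketch only: a simplified length-constrained reward plus a
# minimum-risk-style loss, in the spirit of the abstract. The truncation
# rule, the unigram-F1 stand-in for ROUGE, and the function names here are
# assumptions for illustration, not the authors' exact GOLC objective.

from collections import Counter
from typing import List

import torch


def rouge1_f1(candidate: List[str], reference: List[str]) -> float:
    """Unigram F1 used as a simple stand-in for ROUGE-1 (the paper uses real ROUGE)."""
    if not candidate or not reference:
        return 0.0
    overlap = sum((Counter(candidate) & Counter(reference)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(candidate)
    recall = overlap / len(reference)
    return 2 * precision * recall / (precision + recall)


def length_constrained_reward(candidate: List[str], reference: List[str], budget: int) -> float:
    """Score only the part of the candidate within the length budget,
    so tokens generated beyond the desired length earn no credit (illustrative rule)."""
    return rouge1_f1(candidate[:budget], reference)


def risk_loss(sample_log_probs: torch.Tensor, rewards: torch.Tensor) -> torch.Tensor:
    """Minimum-risk-style expected loss: renormalize the sample probabilities
    and weight (1 - reward) by that distribution."""
    q = torch.softmax(sample_log_probs, dim=0)   # distribution over sampled summaries
    return torch.sum(q * (1.0 - rewards))        # expected "risk" to minimize


if __name__ == "__main__":
    reference = "the cat sat on the mat".split()
    samples = [
        "the cat sat on the mat today and yesterday".split(),  # overlength sample
        "the cat sat on the mat".split(),                       # in-length sample
    ]
    budget = 6  # desired summary length in tokens (hypothetical)
    rewards = torch.tensor([length_constrained_reward(s, reference, budget) for s in samples])
    # Toy log-probabilities standing in for the model's scores of each sample.
    sample_log_probs = torch.tensor([-3.0, -2.5], requires_grad=True)
    loss = risk_loss(sample_log_probs, rewards)
    loss.backward()
    print(loss.item(), sample_log_probs.grad)
```

Because the overlength sample is truncated before scoring, its reward drops relative to the in-length sample, so the gradient pushes probability mass toward summaries that fit the budget, which is the intuition the abstract describes.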
