Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization
2016-12-31 · EACL 2017
Jun Suzuki, Masaaki Nagata
Abstract
This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder and to control the output words in the decoder based on that estimation. Our method shows a significant improvement over a strong RNN-based encoder-decoder baseline and achieves the best results on an abstractive summarization benchmark.
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| DUC 2004 Task 1 | EncDec+WFE | ROUGE-1 | 32.28 | — | Unverified |
| Gigaword | EncDec+WFE | ROUGE-1 | 36.30 | — | Unverified |