Generating Diverse and High-Quality Abstractive Summaries with Variational Transformers

2021-11-16 · ACL ARR November 2021

Anonymous

Abstract

Existing work on abstractive summarization mainly focuses on boosting summary quality (informativeness, contextual similarity). To generate summaries of both high diversity and high quality, we propose the Transformer+CVAE model, which integrates the CVAE framework into the Transformer by introducing prior/recognition networks that bridge the Transformer encoder and decoder. We utilize the latent variables generated in the global receptive field of the Transformer by fusing them into the start-of-sequence ([SOS]) token of the decoder inputs. To better tune the weight of the latent variables in the sequence, we design a gated unit that blends the latent representation and the [SOS] token. Evaluated on the Gigaword dataset, our model outperforms state-of-the-art seq-to-seq models and the base Transformer on both diversity and quality metrics. After scrutinizing the pre-training and gating mechanisms we apply, we find that both schemes help improve the quality of generated summaries in the CVAE framework.
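The abstract does not specify the exact form of the gated unit, but a common formulation blends two vectors with a learned sigmoid gate. The sketch below is an illustrative NumPy implementation under that assumption; the parameter names (`W_g`, `b_g`) and the concatenation-based gate are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative, not from the paper)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical gate parameters; in the model these would be learned.
W_g = rng.normal(scale=0.1, size=(d, 2 * d))
b_g = np.zeros(d)

def gated_fusion(z, sos):
    """Blend the latent variable z with the [SOS] token embedding.

    The gate g (elementwise in (0, 1)) decides how much of the latent
    representation versus the original [SOS] embedding to keep.
    """
    g = sigmoid(W_g @ np.concatenate([z, sos]) + b_g)
    return g * z + (1.0 - g) * sos

z = rng.normal(size=d)    # latent sampled from the prior network
sos = rng.normal(size=d)  # [SOS] token embedding
fused = gated_fusion(z, sos)
print(fused.shape)  # prints (8,)
```

Because the gate produces a convex combination, each coordinate of the fused vector lies between the corresponding coordinates of `z` and `sos`, so the [SOS] input is perturbed toward the latent code rather than replaced by it.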
