
A Unified and Efficient Contrastive Learning Framework for Text Summarization

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

Both extractive and abstractive summarization systems share a common problem: there is a mismatch between the training objective and the evaluation metrics. To bridge this gap, we introduce UniCLS, a unified and efficient contrastive learning framework for both extractive and abstractive methods. In this framework, the summarization model not only learns how to write summaries but is also required to evaluate the generated summary. Without any additional parameters, our contrastive learning approach can be effortlessly applied to any extractive or abstractive summarization system. Extensive experiments show that our framework brings substantial improvements on a wide range of datasets and clearly outperforms previous state-of-the-art end-to-end systems by a large margin on the CNN/DailyMail benchmark. In particular, UniCLS achieves an improvement of 1.75 ROUGE-1 score over the BART-large model without sacrificing inference efficiency.
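The abstract does not spell out the loss function, so the following is only a plausible sketch of the kind of contrastive objective such frameworks typically use: candidate summaries are sorted by a quality metric (e.g., ROUGE) in descending order, and a pairwise margin loss pushes the model to score higher-quality candidates above lower-quality ones. The function name, the rank-dependent margin, and the pre-sorted input convention are all illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of a candidate-ranking contrastive loss for summarization.
# Assumption: candidates are pre-sorted by ROUGE in descending order, and
# model_scores[i] is the model's score for the i-th (i.e., i-th best) candidate.

def contrastive_ranking_loss(model_scores, margin=0.01):
    """Pairwise margin loss over metric-sorted candidate scores.

    For each pair (i, j) with i < j, candidate i has better ROUGE than
    candidate j, so the model's score for i should exceed the score for j
    by at least a margin that grows with the rank gap (j - i).
    """
    loss = 0.0
    n = len(model_scores)
    for i in range(n):
        for j in range(i + 1, n):
            # Hinge term: zero when the ordering constraint is satisfied.
            loss += max(0.0, model_scores[j] - model_scores[i] + (j - i) * margin)
    return loss
```

With this objective, a model whose scores already respect the ROUGE ordering incurs zero loss, while any rank inversion contributes a positive penalty proportional to how badly the ordering is violated.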
