SOTAVerified

May the Force Be with Your Copy Mechanism: Enhanced Supervised-Copy Method for Natural Language Generation

2021-12-20 · arXiv 2021

Sanghyuk Choi, Jeong-in Hwang, Hyungjong Noh, Yeonsoo Lee

Abstract

Recent neural sequence-to-sequence models with a copy mechanism have achieved remarkable progress in various text generation tasks. These models address out-of-vocabulary problems and facilitate the generation of rare words. However, identifying which words should be copied is difficult, and prior copy models consequently suffer from incorrect generation and a lack of abstractness. In this paper, we propose a novel supervised approach to training a copy network that helps the model decide which words need to be copied and which need to be generated. Specifically, we redefine the objective function to leverage source sequences and target vocabularies as guidance for copying. Experimental results on data-to-text generation and abstractive summarization tasks verify that our approach enhances copying quality and improves the degree of abstractness.
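To make the idea concrete, here is a minimal numpy sketch of the general setup the abstract describes: a pointer-generator-style copy mechanism whose gate is additionally supervised with labels derived from the source sequence and target vocabulary. The function names, the gate-labeling heuristic (copy if the target token appears in the source, generate otherwise), and all shapes are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def copy_mixture(gen_logits, attn_logits, p_gen, src_ids, vocab_size):
    """Pointer-generator-style output distribution:
    p(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source
    positions holding w. Shapes and names are illustrative assumptions."""
    p_vocab = softmax(gen_logits)          # (vocab_size,)
    attn = softmax(attn_logits)            # (src_len,)
    dist = p_gen * p_vocab
    for a, sid in zip(attn, src_ids):      # scatter copy probability back
        dist[sid] += (1.0 - p_gen) * a     # into the vocabulary distribution
    return dist

def supervised_copy_loss(p_gen, target_id, src_ids):
    """Hypothetical supervised gate loss: label the step "generate" (1)
    if the target token does not occur in the source, else "copy" (0),
    and apply binary cross-entropy to the gate p_gen."""
    label = 0.0 if target_id in src_ids else 1.0
    eps = 1e-12  # avoid log(0)
    return -(label * np.log(p_gen + eps)
             + (1.0 - label) * np.log(1.0 - p_gen + eps))

# Tiny usage example: 5-word vocabulary, 3-token source [1, 2, 2].
dist = copy_mixture(np.zeros(5), np.zeros(3), p_gen=0.6,
                    src_ids=[1, 2, 2], vocab_size=5)
```

The supervised term pushes the gate toward explicit copy/generate decisions rather than leaving it to emerge implicitly from the likelihood, which is one plausible reading of how the redefined objective guides copying.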

Benchmark Results

Dataset                            Model       Metric     Claimed  Verified  Status
MLB Dataset                        Force-Copy  BLEU       10.5     —         Unverified
MLB Dataset (Content Ordering)     Force-Copy  DLD        21.16    —         Unverified
MLB Dataset (Content Selection)    Force-Copy  Precision  49.39    —         Unverified
MLB Dataset (Relation Generation)  Force-Copy  Precision  84.5     —         Unverified
RotoWire                           Force-Copy  BLEU       17.26    —         Unverified
RotoWire (Content Ordering)        Force-Copy  DLD        17.26    —         Unverified
RotoWire (Content Selection)       Force-Copy  Precision  34.34    —         Unverified
RotoWire (Relation Generation)     Force-Copy  Precision  95.4     —         Unverified