
Sequentially Controlled Text Generation

2023-01-05

Alexander Spangher, Xinyu Hua, Yao Ming, Nanyun Peng


Abstract

While GPT-2 generates sentences that are remarkably human-like, longer documents can ramble and do not follow human-like writing structure. We study the problem of imposing structure on long-range text. We propose a novel controlled text generation task, sequentially controlled text generation, and identify a dataset, NewsDiscourse, as a starting point for this task. We develop a sequentially controlled text generation pipeline with generation and editing steps. We test different degrees of structural awareness and show that, in general, more structural awareness results in higher control accuracy, grammaticality, coherency, and topicality, approaching human-level writing performance.
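
To make the task concrete, below is a minimal sketch of per-sentence controlled generation: each new sentence is conditioned on a discourse control code for its position in the document. This is an illustration only, not the authors' pipeline (which also includes an editing step); the base model, the bracketed control-code format, the example discourse labels, the decoding settings, and the sentence-splitting heuristic are all assumptions made for the sketch.

```python
# Illustrative sketch of sequentially controlled generation with per-sentence
# discourse control codes. Assumptions: plain GPT-2, a hypothetical "[Label]"
# prefix scheme, and a naive period-based sentence splitter.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical discourse-role sequence, one control code per sentence to generate.
control_sequence = ["Main Event", "Consequence", "Previous Event", "Evaluation"]

document = "A severe storm hit the coastal town overnight."
for label in control_sequence:
    # Prepend the control code to steer the next sentence (illustrative scheme).
    prompt = f"[{label}] {document}"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens, not the prompt.
    continuation = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    # Keep only the first generated sentence and append it to the document.
    first_sentence = continuation.split(".")[0].strip() + "."
    document += " " + first_sentence

print(document)
```

In practice, control-accuracy depends on the model having been trained or fine-tuned to respect the control codes; an off-the-shelf GPT-2, as used here purely for illustration, will largely ignore them.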
