
Controlled Text Generation for Black-box Language Models via Score-based Progressive Editor

2023-11-13

Sangwon Yu, Changmin Lee, Hojin Lee, Sungroh Yoon


Abstract

Controlled text generation is essential for the practical use of language models because it ensures that the produced text includes only the desired attributes from a specific domain or dataset. Existing methods, however, are either inapplicable to black-box models or suffer from a significant trade-off between controlling the generated text and maintaining its fluency. This paper introduces the Score-based Progressive Editor (ScoPE), a novel approach designed to overcome these issues. ScoPE modifies the context at the token level during the generation process of a backbone language model, guiding the subsequent text to naturally include the target attributes. To facilitate this process, ScoPE employs a training objective that maximizes a target score, accounting for both the ability to guide the text and its fluency. Experimental results on diverse controlled generation tasks demonstrate that ScoPE can effectively regulate the attributes of the generated text while fully utilizing the capability of the backbone large language models. Our code is available at https://github.com/ysw1021/ScoPE.
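The abstract describes a generate-edit-append loop: the backbone model produces a block of text, an editor modifies it at the token level to raise a target attribute score, and the edited block becomes context for the next block. Below is a minimal, hypothetical sketch of that loop. The paper's actual editor is a trained model with a score-maximizing training objective; here the backbone LM (`backbone_generate`), the attribute score (`target_score`), and the greedy substitution editor (`edit_block`) are all toy placeholders invented for illustration, not the authors' implementation.

```python
import random

# Toy vocabulary and a toy "positive sentiment" attribute (hypothetical).
VOCAB = ["the", "movie", "was", "great", "terrible", "fine", "plot"]
POSITIVE = {"great", "fine"}

def backbone_generate(context, n_tokens, rng):
    """Stand-in for a black-box backbone LM: samples random tokens.
    A real system would call the model's generation API instead."""
    return [rng.choice(VOCAB) for _ in range(n_tokens)]

def target_score(tokens):
    """Toy attribute score: fraction of tokens carrying the target
    attribute. ScoPE's real score also accounts for fluency."""
    return sum(t in POSITIVE for t in tokens) / max(len(tokens), 1)

def edit_block(tokens, rng, steps=20):
    """Token-level editing sketch: propose single-token substitutions
    and accept any edit that does not lower the target score."""
    tokens = list(tokens)
    for _ in range(steps):
        i = rng.randrange(len(tokens))
        candidate = tokens.copy()
        candidate[i] = rng.choice(VOCAB)
        if target_score(candidate) >= target_score(tokens):
            tokens = candidate
    return tokens

def scope_generate(prompt, blocks=3, block_size=4, seed=0):
    """Progressive generation: generate a block, edit it toward the
    target attribute, append it to the context, and repeat."""
    rng = random.Random(seed)
    context = list(prompt)
    for _ in range(blocks):
        block = backbone_generate(context, block_size, rng)
        block = edit_block(block, rng)
        context.extend(block)
    return context

text = scope_generate(["the", "movie"])
```

Because each edit is only accepted when the score does not drop, the attribute score of every block is non-decreasing during editing, which mirrors the progressive guidance idea while leaving the backbone model untouched.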
