
Using Structured Content Plans for Fine-grained Syntactic Control in Pretrained Language Model Generation

2022-10-01 · COLING 2022

Fei-Tzin Lee, Miguel Ballesteros, Feng Nan, Kathleen McKeown


Abstract

Large pretrained language models offer powerful generation capabilities, but cannot be reliably controlled at a sub-sentential level. We propose to make such fine-grained control possible in pretrained LMs by generating text directly from a semantic representation, Abstract Meaning Representation (AMR), which is augmented at the node level with syntactic control tags. We experiment with English-language generation of three modes of syntax relevant to the framing of a sentence (verb voice, verb tense, and the realization of human entities) and demonstrate that they can be reliably controlled, even in settings that diverge drastically from the training distribution. These syntactic aspects contribute to how information is framed in text, which is important for applications, such as summarization, that aim to highlight salient information.
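As an illustration of the idea of node-level augmentation, the sketch below injects a syntactic control tag into a PENMAN-serialized AMR graph. The tag name (`:tense`) and insertion format here are illustrative assumptions for exposition, not the paper's actual tagging scheme.

```python
# Hypothetical sketch: attach a syntactic control tag to an AMR node
# in PENMAN notation. The ":tense past" tag format is an assumption,
# shown only to make "augmented at the node level" concrete.

def add_control_tag(amr: str, concept: str, tag: str, value: str) -> str:
    """Insert a control tag immediately after the given node's concept."""
    marker = f"/ {concept}"
    idx = amr.index(marker) + len(marker)
    return amr[:idx] + f" :{tag} {value}" + amr[idx:]

# Toy AMR for "The boy wants to go."
amr = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
tagged = add_control_tag(amr, "want-01", "tense", "past")
print(tagged)
# (w / want-01 :tense past :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
```

A generation model trained on such tagged inputs can then condition its surface realization (e.g. past-tense or passive-voice output) on the tag values while keeping the underlying semantics fixed.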
