
INSET: Sentence Infilling with INter-SEntential Transformer

2019-11-10 · ACL 2020 · Code Available

Yichen Huang, Yizhe Zhang, Oussama Elachqar, Yu Cheng


Abstract

Missing sentence generation (or sentence infilling) enables a wide range of applications in natural language generation, such as document auto-completion and meeting note expansion. The task asks a model to generate intermediate missing sentences that syntactically and semantically bridge the surrounding context. Solving it requires natural language processing techniques ranging from understanding to discourse-level planning to generation. In this paper, we propose a framework that decouples these three aspects and addresses each in turn, leveraging the power of existing large-scale pre-trained models such as BERT and GPT-2. We empirically demonstrate that our model learns a sentence representation suitable for generation and can generate a missing sentence that fits the context.
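
To make the three-stage pipeline concrete, here is a minimal sketch of how such a system could be wired together from off-the-shelf components: BERT [CLS] vectors as fixed-length sentence features, a small sentence-level transformer that predicts the feature of the missing sentence from its neighbors, and a GPT-2 decoder conditioned on the predicted feature via a prefix embedding. This is an illustration under those assumptions, not the authors' implementation; the `SentenceLevelTransformer` module, the `to_prefix` projection, and the example context are hypothetical, and the non-pretrained components are untrained here, so the decoded output is meaningless until the system is trained.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer, GPT2LMHeadModel, GPT2Tokenizer

bert_tok = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()
gpt2_tok = GPT2Tokenizer.from_pretrained("gpt2")
gpt2 = GPT2LMHeadModel.from_pretrained("gpt2").eval()

D_BERT = D_GPT2 = 768  # hidden sizes of bert-base and gpt2 (small)

class SentenceLevelTransformer(nn.Module):
    """Hypothetical planner: predicts the feature of the missing sentence
    from the features of its neighbors. Positional encodings over sentence
    slots are omitted for brevity; a real planner would need them."""
    def __init__(self, d_model=D_BERT, n_heads=8, n_layers=3):
        super().__init__()
        self.mask_feature = nn.Parameter(torch.zeros(d_model))  # learned gap marker
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, feats, missing_idx):
        # feats: (batch, n_sentences, d_model) with a dummy row at missing_idx
        feats = feats.clone()
        feats[:, missing_idx] = self.mask_feature
        return self.encoder(feats)[:, missing_idx]  # predicted gap feature

@torch.no_grad()
def sentence_feature(sentence):
    """Fixed-length sentence feature: the BERT [CLS] vector, shape (1, 768)."""
    return bert(**bert_tok(sentence, return_tensors="pt")).last_hidden_state[:, 0]

planner = SentenceLevelTransformer().eval()   # untrained, for illustration only
to_prefix = nn.Linear(D_BERT, D_GPT2)         # feature -> GPT-2 prefix embedding

context = ["I moved to Seattle last month.", None,  # None marks the gap
           "The commute is much shorter now."]
missing_idx = context.index(None)
feats = torch.stack([sentence_feature(s) if s is not None
                     else torch.zeros(1, D_BERT) for s in context], dim=1)

with torch.no_grad():
    prefix = to_prefix(planner(feats, missing_idx)).unsqueeze(1)  # (1, 1, 768)
    # Greedy decoding, conditioning GPT-2 on the prefix embedding.
    ids = torch.tensor([[gpt2_tok.bos_token_id]])
    for _ in range(20):
        tok_emb = gpt2.transformer.wte(ids)  # token embeddings, (1, T, 768)
        logits = gpt2(inputs_embeds=torch.cat([prefix, tok_emb], dim=1)).logits
        ids = torch.cat([ids, logits[:, -1].argmax(-1, keepdim=True)], dim=1)

print(gpt2_tok.decode(ids[0, 1:].tolist()))   # gibberish until trained
```

In the abstract's framing, the sentence encoder and decoder cover the understanding and generation aspects, while the sentence-level transformer handles discourse-level planning in the space of sentence features.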
