SOTAVerified

Continuation is a Sub-Task of Fill in the Blank: Why Not Train for Both?

2021-06-16 · ACL ARR Jun 2021

Anonymous


Abstract

The task of inserting text into a specified position in a passage, known as fill in the blank, is useful for a variety of applications where writers interact with a natural language generation (NLG) system to craft text. However, NLG research has mostly focused on continuation models that append text to the end of a passage. Since continuation is in fact a sub-task of fill in the blank, one where the blank is placed at the sequence's end, we propose training a single model that can effectively handle both tasks. The result is improved efficiency, as only one model needs to be maintained, without any negative impact on performance on either task.
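The reduction the abstract describes can be made concrete with a small sketch. The `[blank]` placeholder and input layout below are illustrative assumptions, not the paper's actual serialization format; the point is only that a continuation prompt is a fill-in-the-blank instance whose suffix is empty.

```python
# Hypothetical input formatting, assuming a single "[blank]" placeholder
# token marks where the model should generate text.
BLANK = "[blank]"

def make_fitb_input(prefix: str, suffix: str) -> str:
    """Format a fill-in-the-blank example: the model must produce
    the text belonging at the BLANK position between prefix and suffix."""
    return f"{prefix} {BLANK} {suffix}".strip()

def make_continuation_input(prefix: str) -> str:
    """Continuation is the special case with an empty suffix,
    i.e. the blank sits at the end of the sequence."""
    return make_fitb_input(prefix, "")

# Fill in the blank: generate text between two contexts.
fitb = make_fitb_input("The cat sat", "the mat.")
# Continuation: same format, blank at the end.
cont = make_continuation_input("The cat sat on")
```

Because both tasks share one input format, a single model trained on blanks placed at arbitrary positions, including the final position, covers continuation for free.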
