
Deep dive into CoCon: A Self-Supervised Approach for Controlled Text Generation

2022-01-17 · ICLR Blog Track 2022

Anonymous


Abstract

Transformer-based language models ([1] Vaswani et al., 2017) have spurred transfer learning in NLP and have improved performance on several NLP tasks. The preliminary step involves pretraining a language model on a large corpus of web text. Research on steering a pretrained language model to enable fine-grained control over the content and sentiment of its output is still an active area of exploration, with great potential in applications such as story generation and search engines. This blog post discusses a paper that proposes a content conditioner which, when trained auto-regressively alongside a large pretrained language model (such as GPT-2), provides the capability to control the generated text at a fine-grained level.
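To make the idea of a content conditioner concrete, below is a minimal sketch of a conditioning block that attends from the language model's intermediate hidden states to a content representation and mixes the result back in. The class name, the single cross-attention layer, and the dimensions are illustrative assumptions for this post, not the paper's exact CoCon architecture.

```python
import torch
import torch.nn as nn


class ContentConditioner(nn.Module):
    """Illustrative conditioning block (hypothetical simplification):
    the prompt's hidden states attend to the content's hidden states,
    so the content can steer the representation passed to later layers."""

    def __init__(self, hidden_size: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, prompt_hidden: torch.Tensor, content_hidden: torch.Tensor) -> torch.Tensor:
        # Queries come from the text being generated; keys/values come from
        # the conditioning content.
        mixed, _ = self.attn(prompt_hidden, content_hidden, content_hidden)
        # Residual connection keeps the pretrained LM's representation intact
        # while injecting content information.
        return self.norm(prompt_hidden + mixed)


if __name__ == "__main__":
    batch, prompt_len, content_len, hidden = 2, 10, 5, 768
    block = ContentConditioner(hidden)
    h_prompt = torch.randn(batch, prompt_len, hidden)    # intermediate LM states for the prompt
    h_content = torch.randn(batch, content_len, hidden)  # intermediate LM states for the content
    print(block(h_prompt, h_content).shape)  # torch.Size([2, 10, 768])
```

In the paper's setting, such a block would sit between the layers of a frozen pretrained LM like GPT-2 and be trained self-supervisedly, while the LM's own weights stay fixed.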
