Large-Scale Transfer Learning for Natural Language Generation
2019-07-01 · ACL 2019 · Code Available
Sergey Golovanov, Rauf Kurbanov, Sergey Nikolenko, Kyryl Truskovskyi, Alexander Tselousov, Thomas Wolf
Abstract
Large-scale pretrained language models define the state of the art in natural language processing, achieving outstanding performance on a variety of tasks. We study how these architectures can be applied and adapted for natural language generation, comparing a number of architectural and training schemes. We focus in particular on open-domain dialog as a typical high-entropy generation task, presenting and comparing different architectures for adapting pretrained models, and we report state-of-the-art results.
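As a rough illustration of one adaptation scheme of the kind the paper compares, the sketch below fine-tunes a pretrained GPT-2 decoder on a dialog example by concatenating context, history, and reply into a single sequence and applying the standard language-modeling objective. This is a minimal sketch using the HuggingFace transformers library, not the authors' actual code; the model choice and the toy dialog strings are placeholders.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Placeholder setup: any pretrained causal LM checkpoint could stand in here.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical dialog sample: persona/context, dialog history, and gold reply.
context = "i like to ski . my wife does not ."
history = "hi , how are you doing today ?"
reply = "great , just got back from the slopes !"

# Single-sequence adaptation: the pretrained decoder consumes the whole
# concatenated dialog and is trained to predict each next token.
input_ids = tokenizer.encode(
    context + " " + history + " " + reply, return_tensors="pt"
)
outputs = model(input_ids, labels=input_ids)  # standard LM loss over the sequence
outputs.loss.backward()  # an optimizer step would complete one fine-tuning update

In practice, schemes of this kind differ mainly in how the dialog context is presented to the pretrained model, e.g. as a single concatenated input to a decoder versus separate inputs to an encoder-decoder; the single-input variant shown here is the simplest of these options.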