Modern Methods for Text Generation
2020-09-10
Dimas Munoz Montesinos
- github.com/DimasDMM/nlp-completer (PyTorch) ★ 12
- github.com/DimasDMM/transformers (PyTorch) ★ 12
Abstract
Synthetic text generation is challenging and has had limited success. Recently, a new architecture, the Transformer, has allowed machine learning models to understand sequential data better, improving tasks such as translation and summarization. BERT and GPT-2, which use Transformers at their core, have shown great performance in tasks such as text classification, translation and NLI. In this article, we analyse both models and compare their output quality in text generation tasks.
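As background for the comparison, the mechanism at the heart of the Transformer architecture used by both BERT and GPT-2 is scaled dot-product attention. The following is a minimal NumPy sketch of that operation (an illustrative assumption for this article, not code from the paper's repositories):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ V, weights

# Toy example: 3 query positions attend over 3 key/value positions of dim 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
output, attn = scaled_dot_product_attention(Q, K, V)
```

Each row of `attn` is a probability distribution over the input positions, which is what lets these models weigh every token against every other token in a sequence.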