On the Anatomy of Latent-variable Generative Models for Conditional Text Generation
Anonymous
Abstract
Conditional text generation is a non-trivial task that has, to date, been performed predominantly with latent-variable generative models. In this work, we explore several design choices that affect two essential aspects of model performance: expressivity and controllability. We experiment with a series of latent-variable models built around simple design changes within a general unified framework, focusing in particular on prior distributions based on Energy-Based Models (EBMs) rather than the usual standard Gaussian. Our experiments confirm that this richer prior yields greater representational power, but they also show that it makes training more difficult. We provide a comprehensive analysis of these difficulties and a close comparison with recent work on EBM-based priors for conditional text generation.
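To make the contrast with the standard Gaussian prior concrete, here is a minimal illustrative sketch (not the paper's actual model) of an EBM-tilted prior over a latent code z, of the form p(z) ∝ exp(−E(z)) · N(z; 0, I). The energy network E is a hypothetical toy MLP with random weights; all names and dimensions below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy network E(z): a one-hidden-layer MLP with random weights.
# In a real model these weights would be learned jointly with the decoder.
LATENT_DIM, HIDDEN = 8, 16
W1 = rng.normal(scale=0.1, size=(LATENT_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
w2 = rng.normal(scale=0.1, size=HIDDEN)

def energy(z):
    """Scalar energy E(z) for a batch of latents z with shape (B, LATENT_DIM)."""
    h = np.tanh(z @ W1 + b1)
    return h @ w2  # shape (B,)

def log_prior_unnorm(z):
    """Unnormalized log density of the tilted prior: -E(z) + log N(z; 0, I).

    With E ≡ 0 this reduces to the standard Gaussian prior; a learned E
    reshapes the prior, which is the source of the extra expressivity.
    """
    log_gauss = -0.5 * np.sum(z ** 2, axis=-1)  # up to an additive constant
    return -energy(z) + log_gauss

z = rng.normal(size=(4, LATENT_DIM))
print(log_prior_unnorm(z).shape)
```

Because the normalizing constant of exp(−E(z)) · N(z; 0, I) is intractable, training such a prior typically requires MCMC sampling from it, which is one concrete reason the richer prior is harder to train than the closed-form Gaussian.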