
Learning Energy-Based Generative Models via Coarse-to-Fine Expanding and Sampling

2021-01-01 · ICLR 2021

Yang Zhao, Jianwen Xie, Ping Li


Abstract

Energy-based models (EBMs) for generative modeling parametrize a single network and can be trained directly by maximum likelihood estimation. Despite their simplicity and tractability, current approaches are either unstable to train or unable to synthesize diverse, high-fidelity images. We propose to train EBMs via a multistage coarse-to-fine expanding and sampling strategy, which we call CF-EBM. To improve the learning procedure, we construct an effective network architecture and advocate the use of smooth activations. The resulting approach is computationally efficient and achieves the best image-generation performance among EBMs, surpassing the spectral normalization GAN. Furthermore, we provide a recipe that makes ours the first EBM to successfully synthesize 256×256-pixel images. Finally, we readily generalize CF-EBM to one-sided unsupervised image-to-image translation and beat baseline methods while reducing the model size by 1000× and the training budget by 9×. In parallel, we present a gradient-based discriminative saliency method to explicitly interpret the translation dynamics, which aligns with human behavior.
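The abstract's mention of maximum-likelihood training hinges on drawing samples from the model distribution p(x) ∝ exp(−E(x)), typically with Langevin dynamics. The sketch below is a generic, minimal illustration of that sampling step on a toy 1D energy, not the paper's CF-EBM implementation; the function name and parameters are ours.

```python
import numpy as np

def langevin_sample(grad_energy, x0, n_steps=2000, step=0.05, rng=None):
    """Approximate samples from p(x) ∝ exp(-E(x)) via unadjusted Langevin dynamics.

    Each update: x ← x - (step/2) * ∇E(x) + sqrt(step) * ξ,  ξ ~ N(0, 1).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - 0.5 * step * grad_energy(x) + np.sqrt(step) * noise
    return x

# Toy energy E(x) = (x - 2)^2 / 2, so the target density is N(2, 1).
grad_E = lambda x: x - 2.0

# Run 5000 parallel chains from x = 0; their empirical mean should drift
# toward 2 and their spread toward a standard deviation of roughly 1.
samples = langevin_sample(grad_E, x0=np.zeros(5000))
print(samples.mean(), samples.std())
```

In full EBM training, the same sampler (run over images, with ∇E computed by backpropagation through the energy network) supplies the "negative" samples whose energy gradients are balanced against those of the data in the maximum-likelihood update; CF-EBM's contribution is to run this process coarse-to-fine across growing resolutions.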
