
Hierarchical VAE with a Diffusion-based VampPrior

2024-12-02 · Code Available

Anna Kuzina, Jakub M. Tomczak


Abstract

Deep hierarchical variational autoencoders (VAEs) are powerful latent variable generative models. In this paper, we introduce Hierarchical VAE with Diffusion-based Variational Mixture of the Posterior Prior (VampPrior). We apply amortization to scale the VampPrior to models with many stochastic layers. The proposed approach allows us to achieve better performance compared to the original VampPrior work and other deep hierarchical VAEs, while using fewer parameters. We empirically validate our method on standard benchmark datasets (MNIST, OMNIGLOT, CIFAR10) and demonstrate improved training stability and latent space utilization.
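The core VampPrior idea is to replace a standard Gaussian prior with a mixture of the encoder's variational posteriors evaluated at learned pseudo-inputs; the paper amortizes this construction so it scales to hierarchical VAEs with many stochastic layers. Below is a minimal single-layer sketch of the generic mixture-of-posteriors prior, not the authors' amortized, diffusion-based variant; `encoder`, `input_shape`, and `num_pseudo_inputs` are hypothetical names used only for illustration.

```python
# Illustrative sketch (assumed interface, not the paper's code): a VampPrior
# evaluates log p(z) as a uniform mixture of the encoder's Gaussian posteriors
# q(z | u_k) at K learned pseudo-inputs u_1, ..., u_K.
import math
import torch
import torch.nn as nn


class VampPriorSketch(nn.Module):
    """Toy single-layer VampPrior.

    Assumes `encoder(x)` returns the mean and log-variance of q(z | x).
    """

    def __init__(self, encoder, input_shape, num_pseudo_inputs=16):
        super().__init__()
        self.encoder = encoder
        # Learnable pseudo-inputs with the same shape as the data.
        self.pseudo_inputs = nn.Parameter(
            torch.randn(num_pseudo_inputs, *input_shape)
        )

    def log_prob(self, z):
        # Posterior parameters at every pseudo-input: shapes (K, D).
        mu, logvar = self.encoder(self.pseudo_inputs)
        z = z.unsqueeze(1)  # (B, 1, D) broadcast against (K, D)
        # Diagonal-Gaussian log-density, up to the constant -0.5 * D * log(2*pi).
        log_q = -0.5 * (logvar + (z - mu) ** 2 / logvar.exp()).sum(-1)  # (B, K)
        # log p(z) = log( (1/K) * sum_k q(z | u_k) )
        return torch.logsumexp(log_q, dim=1) - math.log(self.pseudo_inputs.shape[0])
```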

Tasks

Benchmark Results

Dataset    Model    Metric          Claimed  Verified  Status
CIFAR-10   DVP-VAE  NLL (bits/dim)  2.73     —         Unverified
MNIST      DVP-VAE  NLL (nats)      77.1     —         Unverified
OMNIGLOT   DVP-VAE  NLL (nats)      89.07    —         Unverified
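Note that the CIFAR-10 result is reported in bits per dimension, while the MNIST and OMNIGLOT results are per-image negative log-likelihoods. When comparing numbers from different papers it can help to convert between the two; the helper below is an illustrative utility, not part of any evaluation code referenced on this page.

```python
import math


def nats_to_bits_per_dim(nll_nats: float, num_dims: int) -> float:
    """Convert a per-image NLL in nats to bits per dimension."""
    return nll_nats / (num_dims * math.log(2))


# Example: a CIFAR-10 image has 32 * 32 * 3 = 3072 dimensions, so the claimed
# 2.73 bits/dim corresponds to roughly 2.73 * 3072 * ln(2) ≈ 5813 nats per image.
```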

Reproductions