
Variational Inference for Latent Variable Models in High Dimensions

2025-06-02

Chenyang Zhong, Sumit Mukherjee, Bodhisattva Sen

Abstract

Variational inference (VI) is a popular method for approximating intractable posterior distributions in Bayesian inference and probabilistic machine learning. In this paper, we introduce a general framework for quantifying the statistical accuracy of mean-field variational inference (MFVI) for posterior approximation in Bayesian latent variable models with categorical local latent variables. Using this framework, we characterize the exact asymptotic regime in which MFVI "works" for the celebrated latent Dirichlet allocation (LDA) model. Focusing on the mixed membership stochastic blockmodel (MMSB), we show that the vanilla fully factorized MFVI, often used in the literature, is suboptimal. We propose a partially grouped VI algorithm for this model, show that it works, and derive its exact asymptotic performance. We further show that our bounds are tight for both of the above models.
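To make the fully factorized mean-field idea concrete, the sketch below runs coordinate-ascent VI (CAVI) on a toy Bayesian Gaussian mixture with categorical latent assignments. This is a generic textbook illustration, not the LDA/MMSB setting or the partially grouped algorithm analyzed in the paper; all variable names and the model (unit noise variance, standard normal prior on the component means) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated clusters (hypothetical example,
# not the LDA/MMSB models studied in the paper).
x = np.concatenate([rng.normal(-3.0, 1.0, 100), rng.normal(3.0, 1.0, 100)])
n, K = x.shape[0], 2

# Fully factorized mean-field family:
#   q(z, mu) = prod_i q(z_i) * prod_k q(mu_k),
# with r[i, k] = q(z_i = k) and q(mu_k) = N(m[k], s2[k]).
r = rng.dirichlet(np.ones(K), size=n)
m = rng.normal(0.0, 1.0, K)
s2 = np.ones(K)

for _ in range(50):
    # CAVI update for each categorical factor q(z_i):
    # proportional to exp(E_q[log p(x_i | mu_{z_i})]).
    logits = x[:, None] * m[None, :] - 0.5 * (m**2 + s2)[None, :]
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    r = np.exp(logits)
    r /= r.sum(axis=1, keepdims=True)

    # CAVI update for q(mu_k) under prior mu_k ~ N(0, 1) and unit
    # observation variance: a conjugate Gaussian update.
    nk = r.sum(axis=0)
    s2 = 1.0 / (1.0 + nk)
    m = s2 * (r * x[:, None]).sum(axis=0)

print(np.sort(m))  # variational means near the two cluster centers
```

The paper's partially grouped variant would instead keep some latent variables jointly in one factor rather than factorizing them all independently, which is the source of the accuracy gap it quantifies.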
