Near-Optimal Approximations for Bayesian Inference in Function Space

2025-02-25

Veit Wild, James Wu, Dino Sejdinovic, Jeremias Knoblauch

Abstract

We propose a scalable inference algorithm for Bayes posteriors defined on a reproducing kernel Hilbert space (RKHS). Given a likelihood function and a Gaussian random element representing the prior, the corresponding Bayes posterior measure $\Pi_B$ can be obtained as the stationary distribution of an RKHS-valued Langevin diffusion. We approximate this infinite-dimensional Langevin diffusion by projecting it onto the first $M$ components of the Kosambi-Karhunen-Loève expansion. Exploiting the resulting approximate posterior over these $M$ components, we perform inference for $\Pi_B$ by relying on the law of total probability and a sufficiency assumption. The resulting method scales as $O(M^3 + JM^2)$, where $J$ is the number of samples produced from the posterior measure $\Pi_B$. Interestingly, the algorithm recovers the posterior arising from the sparse variational Gaussian process (SVGP) (see Titsias, 2009) as a special case, owing to the fact that the sufficiency assumption underlies both methods. However, whereas the SVGP is parametrically constrained to be a Gaussian process, our method is based on the non-parametric variational family $\mathcal{P}(\mathbb{R}^M)$ consisting of all probability measures on $\mathbb{R}^M$. As a result, our method is provably close to the optimal $M$-dimensional variational approximation of the Bayes posterior $\Pi_B$ in $\mathcal{P}(\mathbb{R}^M)$ for convex and Lipschitz continuous negative log-likelihoods, and coincides with the SVGP for the special case of a Gaussian error likelihood.
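To make the algorithmic recipe concrete, below is a minimal, self-contained NumPy sketch of a projected Langevin sampler in the Gaussian-likelihood special case. Everything in it is an illustrative assumption rather than the authors' implementation: the RBF kernel, the synthetic regression data, the landmark points `Z`, and the Nyström eigendecomposition used as a stand-in for the first $M$ Karhunen-Loève components.

```python
import numpy as np

# All names and modelling choices below are illustrative assumptions,
# not the paper's actual implementation.
rng = np.random.default_rng(0)

def rbf_kernel(X, Y, lengthscale=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

# Synthetic regression data for a Gaussian error likelihood.
n = 50
X = rng.uniform(-3.0, 3.0, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Nystroem stand-in for the first M Karhunen-Loeve components:
# the one-off eigendecomposition is the O(M^3) term in the complexity.
M = 15
Z = rng.uniform(-3.0, 3.0, size=(M, 1))          # landmark points (assumed)
Kzz = rbf_kernel(Z, Z) + 1e-8 * np.eye(M)
eigvals, eigvecs = np.linalg.eigh(Kzz)           # O(M^3)

def features(Xq):
    """Map inputs to M coordinates so that f(x) = features(x) @ xi, xi ~ N(0, I)."""
    return rbf_kernel(Xq, Z) @ eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))

Phi = features(X)                                 # (n, M) design matrix

# Gaussian likelihood => the negative log posterior over xi is quadratic.
# Precomputing A and b makes each gradient evaluation O(M^2).
noise_var = 0.01
A = Phi.T @ Phi / noise_var + np.eye(M)           # likelihood curvature + N(0, I) prior
b = Phi.T @ y / noise_var

# Unadjusted Langevin algorithm on R^M: J steps cost O(J M^2) overall.
step, J = 1e-4, 2000
xi = np.zeros(M)
samples = np.empty((J, M))
for j in range(J):
    grad = A @ xi - b                             # gradient of the negative log posterior
    xi = xi - step * grad + np.sqrt(2.0 * step) * rng.standard_normal(M)
    samples[j] = xi

# Turn coefficient samples into function-space posterior draws at test inputs.
X_test = np.linspace(-3.0, 3.0, 100)[:, None]
f_draws = samples @ features(X_test).T            # (J, 100) posterior function samples
print("posterior mean at first 5 test points:", f_draws.mean(axis=0)[:5])
```

Precomputing the $M \times M$ matrix `A` makes each Langevin step $O(M^2)$, so drawing $J$ samples costs $O(JM^2)$ on top of the one-off $O(M^3)$ eigendecomposition, consistent with the rate quoted above. In this Gaussian special case the sampler targets the same Gaussian posterior that the SVGP computes in closed form, illustrating the claimed coincidence; a non-Gaussian, convex and Lipschitz negative log-likelihood would simply replace the quadratic gradient term.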
