
Latent-IMH: Efficient Bayesian Inference for Inverse Problems with Approximate Operators

2026-03-04

Youguang Chen, George Biros


Abstract

We study sampling from posterior distributions in Bayesian linear inverse problems where the parameter-to-observable operator A is computationally expensive. In many applications, A can be factored in a manner that facilitates the construction of a cost-effective approximation Ã. In this framework, we introduce Latent-IMH, a sampling method based on the independent Metropolis-Hastings (IMH) sampler. Latent-IMH first generates intermediate latent variables using the approximate operator Ã, and then refines them using the exact operator A. Its primary benefit is that it shifts most of the computational cost to an offline phase. We theoretically analyze the performance of Latent-IMH using KL divergence and mixing time bounds. Using numerical experiments on several model problems, we show that, under reasonable assumptions, it outperforms state-of-the-art methods such as the No-U-Turn sampler (NUTS) in computational efficiency. In some cases, Latent-IMH can be orders of magnitude faster than existing schemes.
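To make the core idea concrete, the following is a minimal sketch of independent Metropolis-Hastings with a proposal built from a cheap approximate operator, applied to a toy Gaussian linear inverse problem. This is an illustration of the IMH mechanism only, not the paper's full Latent-IMH algorithm; the problem sizes, the perturbation used to fabricate the approximation Ã, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem y = A x + noise (all sizes hypothetical).
n, m = 4, 6
A = rng.standard_normal((m, n))                    # "expensive" exact operator
A_tilde = A + 0.05 * rng.standard_normal((m, n))   # cheap approximation of A
sigma2 = 0.1                                       # noise variance
x_true = rng.standard_normal(n)
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(m)

def posterior_gaussian(Amat):
    """Gaussian posterior for prior x ~ N(0, I) and likelihood N(Amat x, sigma2 I)."""
    prec = np.eye(n) + Amat.T @ Amat / sigma2
    cov = np.linalg.inv(prec)
    return cov @ (Amat.T @ y / sigma2), cov

def log_like(Amat, x):
    r = y - Amat @ x
    return -0.5 * r @ r / sigma2

def log_prior(x):
    return -0.5 * x @ x

# Proposal: the exact posterior of the *approximate* problem (uses only A_tilde,
# so it can be precomputed offline).
mu_q, cov_q = posterior_gaussian(A_tilde)
Lq = np.linalg.cholesky(cov_q)
prec_q = np.linalg.inv(cov_q)

def log_q(x):
    d = x - mu_q
    return -0.5 * d @ prec_q @ d

# Independent Metropolis-Hastings targeting the exact posterior (uses A online).
x = mu_q.copy()
samples, accepts = [], 0
for _ in range(2000):
    x_prop = mu_q + Lq @ rng.standard_normal(n)
    log_alpha = (log_like(A, x_prop) + log_prior(x_prop) - log_q(x_prop)
                 - (log_like(A, x) + log_prior(x) - log_q(x)))
    if np.log(rng.random()) < log_alpha:
        x, accepts = x_prop, accepts + 1
    samples.append(x)

acc_rate = accepts / 2000
post_mean = np.mean(samples, axis=0)
```

Because the proposal is close to the target when Ã is close to A, the acceptance rate stays high and the chain mixes quickly; the expensive operator A appears only inside the accept/reject step.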
