Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
Qiang Liu, Dilin Wang
Code
- github.com/DartML/Stein-Variational-Gradient-Descent (official, in paper; TensorFlow, ★ 0)
- github.com/larslorch/dibs (JAX, ★ 52)
- github.com/activatedgeek/stein-gradient (PyTorch, ★ 48)
- github.com/lucadellalib/bdl-rul-svgd (PyTorch, ★ 29)
- github.com/aleatory-science/smi_experiments (JAX, ★ 9)
- github.com/louissharrock/coin-svgd (PyTorch, ★ 6)
- github.com/feynmanliang/dist-svgd (PyTorch, ★ 0)
- github.com/LMikeH/ocbnn-lmikh (PyTorch, ★ 0)
- github.com/lpodl/Stein-Variational-Gradient-Descend (no framework listed, ★ 0)
- github.com/MindSpore-scientific-2/code-11/tree/main/SVGD (MindSpore, ★ 0)
Abstract
We propose a general-purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. Our method iteratively transports a set of particles to match the target distribution, by applying a form of functional gradient descent that minimizes the KL divergence. Empirical studies are performed on various real-world models and datasets, on which our method is competitive with existing state-of-the-art methods. The derivation of our method is based on a new theoretical result that connects the derivative of the KL divergence under smooth transforms with Stein's identity and a recently proposed kernelized Stein discrepancy, which is of independent interest.
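The particle-transport idea in the abstract can be sketched in a few lines. The sketch below implements the standard SVGD update with an RBF kernel: each particle moves along phi(x_i) = (1/n) Σ_j [k(x_j, x_i) ∇ log p(x_j) + ∇_{x_j} k(x_j, x_i)], where the first term pulls particles toward high-density regions and the second acts as a repulsive force. The fixed bandwidth, step size, and the toy Gaussian target are illustrative assumptions, not taken from the paper (which uses a median-distance bandwidth heuristic).

```python
import numpy as np

def svgd_step(X, grad_logp, stepsize=0.1, h=1.0):
    """One SVGD update with an RBF kernel k(x, y) = exp(-||x - y||^2 / h).

    X: (n, d) array of particles; grad_logp: (n, d) scores grad log p(x_j).
    Bandwidth h is fixed here for simplicity (the paper uses a median heuristic).
    """
    diff = X[:, None, :] - X[None, :, :]             # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)      # kernel matrix K[i, j]
    # Driving term: (1/n) sum_j k(x_j, x_i) * grad log p(x_j)
    attract = K @ grad_logp
    # Repulsive term: (1/n) sum_j grad_{x_j} k(x_j, x_i) = (2/h) sum_j (x_i - x_j) K[i, j]
    repulse = (2.0 / h) * np.sum(diff * K[:, :, None], axis=1)
    phi = (attract + repulse) / X.shape[0]
    return X + stepsize * phi

# Toy example: transport particles initialized far away toward a standard
# normal target, whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(100, 1))
for _ in range(500):
    X = svgd_step(X, grad_logp=-X, stepsize=0.1)
```

After the loop the particle cloud should sit near the target: sample mean close to 0, with the repulsive term keeping the particles spread out rather than collapsing to the mode.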