Accelerating Convergence of Stein Variational Gradient Descent via Deep Unfolding

2024-02-23

Yuya Kawamura, Satoshi Takabe

Abstract

Stein variational gradient descent (SVGD) is a prominent particle-based variational inference method for sampling from a target distribution. SVGD has attracted interest for machine-learning applications such as Bayesian inference. In this paper, we propose novel trainable algorithms that incorporate a deep-learning technique called deep unfolding into SVGD. This approach enables the internal parameters of SVGD to be learned, thereby accelerating its convergence. To evaluate the proposed trainable SVGD algorithms, we conducted numerical simulations of three tasks: sampling from a one-dimensional Gaussian mixture, performing Bayesian logistic regression, and learning Bayesian neural networks. The results show that the proposed algorithms converge faster than conventional variants of SVGD.
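To make the setting concrete, the following is a minimal NumPy sketch of plain (untrained) SVGD on a task like the paper's first experiment: sampling a one-dimensional Gaussian mixture. The mixture parameters, RBF bandwidth, step size, and iteration count are all illustrative assumptions, not values from the paper. The unfolded variant the authors propose would, roughly, replace the fixed step size `eps` with internal parameters learned per iteration by backpropagating through the unrolled updates; that training loop is not shown here.

```python
import numpy as np

def grad_log_p(x):
    """Score function of an illustrative 1-D target: 0.5*N(-2,1) + 0.5*N(2,1).
    (Mixture parameters are assumed for this example, not taken from the paper.)"""
    mu = np.array([-2.0, 2.0])
    comp = 0.5 * np.exp(-0.5 * (x[:, None] - mu) ** 2)  # unnormalized component densities
    p = comp.sum(axis=1)
    dp = (comp * -(x[:, None] - mu)).sum(axis=1)
    return dp / p

def svgd_step(x, eps, h=1.0):
    """One SVGD update with an RBF kernel of fixed bandwidth h (assumed)."""
    diff = x[:, None] - x[None, :]            # diff[j, i] = x_j - x_i
    K = np.exp(-diff ** 2 / (2 * h ** 2))     # k(x_j, x_i)
    dK = -diff / h ** 2 * K                   # d k(x_j, x_i) / d x_j
    # phi_i = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ]
    phi = (K @ grad_log_p(x) + dK.sum(axis=0)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.uniform(-6.0, 6.0, size=100)          # initial particles
for t in range(500):
    # Plain SVGD uses a fixed step size; the paper's deep-unfolded variant
    # would learn iteration-dependent step sizes to accelerate convergence.
    x = svgd_step(x, eps=0.1)
print(round(float(x.mean()), 2))
```

After the loop, the particles spread over both mixture modes near -2 and +2: the first term of the update drives particles toward high-density regions, while the kernel-gradient term acts as a repulsive force that prevents them from collapsing onto a single mode.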
