
Kernel Implicit Variational Inference

2017-05-29 · ICLR 2018

Jiaxin Shi, Shengyang Sun, Jun Zhu


Abstract

Recent progress in variational inference has paid much attention to the flexibility of variational posteriors. One promising direction is to use implicit distributions, i.e., distributions without tractable densities, as the variational posterior. However, existing methods for implicit posteriors still face the challenges of noisy estimation and computational infeasibility when applied to models with high-dimensional latent variables. In this paper, we present a new approach named Kernel Implicit Variational Inference that addresses these challenges. To the best of our knowledge, this is the first time implicit variational inference has been successfully applied to Bayesian neural networks, showing promising results on both regression and classification tasks.
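The core difficulty the abstract points at can be made concrete with a small, self-contained sketch: samples from an implicit posterior are cheap to draw (push noise through a nonlinear map), but the resulting density is intractable, so the log density ratio against the prior must be estimated from samples. The snippet below is a toy illustration, not the paper's method: the transform, kernel bandwidth, and the uLSIF-style kernel ridge regression for the ratio are all illustrative stand-ins for the paper's kernel estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Implicit variational posterior (hypothetical toy version) --------------
# Pushing Gaussian noise through a fixed nonlinear map makes sampling easy,
# but leaves the density q(z) without a closed form.
W = rng.normal(size=(1, 4)) * 0.5

def sample_q(n):
    eps = rng.normal(size=(n, 4))          # base noise
    return np.tanh(eps) @ W.T + 1.0        # implicit posterior samples, (n, 1)

# --- Kernel least-squares density-ratio sketch ------------------------------
# Fit r(z) ~ q(z)/p(z) as an RBF kernel expansion by ridge regression,
# standing in for a kernel-based ratio estimator.
def rbf(a, b, gamma=0.5):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

zq = sample_q(200)                         # samples from implicit q
zp = rng.normal(size=(200, 1))             # samples from prior p = N(0, 1)
centers = zq[:50]                          # kernel centers on q's support

Phi_p = rbf(zp, centers)
Phi_q = rbf(zq, centers)
H = Phi_p.T @ Phi_p / len(zp)
h = Phi_q.mean(axis=0)
alpha = np.linalg.solve(H + 1e-3 * np.eye(len(centers)), h)

ratio_near_q = (rbf(np.array([[1.0]]), centers) @ alpha).item()
ratio_in_tail = (rbf(np.array([[-3.0]]), centers) @ alpha).item()
print(ratio_near_q > ratio_in_tail)        # the ratio should peak where q has mass
```

The fitted expansion gives a smooth, low-variance estimate of the ratio from samples alone, which is the general route implicit variational inference takes to make the otherwise intractable KL term computable.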
