
Computing the quality of the Laplace approximation

2017-11-24

Guillaume P. Dehaene


Abstract

Bayesian inference requires approximation methods to become computable, but for most of them it is impossible to quantify how close the approximation is to the true posterior. In this work, we present a theorem upper-bounding the KL divergence between a log-concave target density f and its Laplace approximation g. The bound we present is computable: on the classical logistic regression model, we find our bound to be almost exact as long as the dimensionality of the parameter space is high. The approach we followed in this work can be extended to other Gaussian approximations, as we will do in an extended version of this work, to be submitted to the Annals of Statistics. It will then become a critical tool for characterizing whether, for a given problem, a given Gaussian approximation is suitable, or whether a more precise alternative method should be used instead.
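For context, the Laplace approximation discussed in the abstract replaces the posterior with a Gaussian centered at the posterior mode, with covariance given by the inverse Hessian of the negative log-posterior at that mode. The sketch below illustrates this construction for a logistic regression model with a standard normal prior; the data, prior, and Newton solver here are illustrative assumptions, not the paper's experimental setup, and the paper's KL-divergence bound itself is not implemented.

```python
import numpy as np

def laplace_approximation(grad, hess, theta0, iters=50):
    """Newton iterations to find the posterior mode; the Gaussian
    approximation has mean = mode, covariance = inverse Hessian there."""
    theta = theta0.astype(float)
    for _ in range(iters):
        theta = theta - np.linalg.solve(hess(theta), grad(theta))
    return theta, np.linalg.inv(hess(theta))

# Toy logistic regression data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = (rng.random(100) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def grad(w):
    """Gradient of the negative log-posterior (logistic likelihood + N(0, I) prior)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) + w

def hess(w):
    """Hessian of the negative log-posterior; positive definite, so the
    target is log-concave as required by the theorem in the abstract."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return (X * (p * (1 - p))[:, None]).T @ X + np.eye(3)

mode, cov = laplace_approximation(grad, hess, np.zeros(3))
```

Because the logistic log-likelihood plus a Gaussian prior is strictly log-concave, Newton's method converges to the unique mode, and `cov` is symmetric positive definite.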
