JANA: Jointly Amortized Neural Approximation of Complex Bayesian Models
Stefan T. Radev, Marvin Schmitt, Valentin Pratz, Umberto Picchini, Ullrich Köthe, Paul-Christian Bürkner
Code
- github.com/stefanradev93/BayesFlow (official, in paper; TensorFlow)
- github.com/bayesflow-org/jana-paper (official, in paper; PyTorch)
- github.com/bayesflow-org/bayesflow (TensorFlow)
- github.com/bayesflow-org/hierarchical-model-comparison (TensorFlow)
Abstract
This work proposes "jointly amortized neural approximation" (JANA) of intractable likelihood functions and posterior densities arising in Bayesian surrogate modeling and simulation-based inference. We train three complementary networks in an end-to-end fashion: 1) a summary network to compress individual data points, sets, or time series into informative embedding vectors; 2) a posterior network to learn an amortized approximate posterior; and 3) a likelihood network to learn an amortized approximate likelihood. Their interaction opens a new route to amortized marginal likelihood and posterior predictive estimation, two important ingredients of Bayesian workflows that are often too expensive for standard methods. We benchmark the fidelity of JANA against state-of-the-art Bayesian methods on a variety of simulation models and propose a powerful and interpretable diagnostic for joint calibration. In addition, we investigate the ability of recurrent likelihood networks to emulate complex time series models without resorting to hand-crafted summary statistics.
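To make the three-network setup concrete, below is a minimal PyTorch sketch of the joint training objective. It is not the paper's implementation (the official code builds on BayesFlow); the toy simulator, network sizes, and the conditional-Gaussian density heads standing in for the paper's normalizing flows are all illustrative assumptions. Only the overall structure follows the abstract: a summary, a posterior, and a likelihood network trained end-to-end on a sum of posterior and likelihood log-density terms.

```python
import torch
import torch.nn as nn

# Toy simulator (illustrative assumption): theta ~ p(theta), x ~ p(x | theta).
def simulate(batch_size, n_obs=50):
    theta = torch.randn(batch_size, 2)                       # prior draw: (loc, log-scale)
    x = theta[:, :1] + theta[:, 1:].exp() * torch.randn(batch_size, n_obs)
    return theta, x.unsqueeze(-1)                            # x: (batch, n_obs, 1)

class SummaryNet(nn.Module):
    """Compresses an exchangeable set of observations into an embedding vector."""
    def __init__(self, dim_in=1, dim_emb=16):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU(),
                                 nn.Linear(64, dim_emb))

    def forward(self, x):
        return self.phi(x).mean(dim=1)                       # mean-pool over the set

class CondGaussian(nn.Module):
    """Conditional Gaussian density head: a simple stand-in for a normalizing flow."""
    def __init__(self, dim_cond, dim_out):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_cond, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * dim_out))

    def log_prob(self, y, cond):
        mu, log_sigma = self.net(cond).chunk(2, dim=-1)
        return torch.distributions.Normal(mu, log_sigma.exp()).log_prob(y).sum(dim=-1)

summary = SummaryNet()
posterior = CondGaussian(dim_cond=16, dim_out=2)             # q(theta | h(x))
likelihood = CondGaussian(dim_cond=2, dim_out=1)             # l(x_i | theta), per data point

params = (list(summary.parameters()) + list(posterior.parameters())
          + list(likelihood.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(1000):
    theta, x = simulate(128)
    h = summary(x)                                           # embed the observed set
    theta_rep = theta.unsqueeze(1).expand(-1, x.shape[1], -1)
    log_post = posterior.log_prob(theta, h)                  # amortized posterior term
    log_lik = likelihood.log_prob(x, theta_rep).sum(dim=1)   # amortized likelihood term
    loss = -(log_post + log_lik).mean()                      # joint end-to-end objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained, the two density networks can be combined; for instance, one standard route to the amortized marginal-likelihood estimation mentioned above is importance sampling, p(x) ≈ (1/S) Σ_s l(x | θ_s) p(θ_s) / q(θ_s | x) with θ_s drawn from the amortized posterior.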