SOTAVerified

Bayesian hierarchical stacking: Some models are (somewhere) useful

2021-01-22 · Code Available

Yuling Yao, Gregor Pirš, Aki Vehtari, Andrew Gelman


Abstract

Stacking is a widely used model averaging technique that asymptotically yields optimal predictions among linear averages. We show that stacking is most effective when model predictive performance is heterogeneous across inputs, and that the stacked mixture can be further improved with a hierarchical model. We generalize stacking to Bayesian hierarchical stacking: the model weights vary as a function of the input, are partially pooled, and are inferred using Bayesian inference. We further incorporate discrete and continuous inputs, other structured priors, and time series and longitudinal data. To verify the performance gain of the proposed method, we derive theoretical bounds and demonstrate the approach on several applied problems.
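The core idea of input-varying weights can be illustrated with a minimal sketch. The snippet below is a hypothetical MAP-style simplification, not the full Bayesian inference described in the paper: weights for two models are a softmax of a linear function of a scalar input, w(x) = softmax(mu + alpha*x, 0), and a normal penalty on alpha stands in for the partial-pooling prior. All function names and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def input_varying_weights(X, mu, alpha):
    """Pointwise softmax weights w_k(x) = softmax(mu_k + alpha_k * x).

    The last model is the reference: its unconstrained score is fixed at 0.
    X: (n,) inputs; mu, alpha: (K-1,) parameters. Returns an (n, K) array.
    """
    logits = mu[None, :] + alpha[None, :] * X[:, None]    # (n, K-1)
    logits = np.hstack([logits, np.zeros((len(X), 1))])   # append reference
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

def objective(params, X, dens, sigma=1.0):
    """Negative stacked log score plus a normal penalty on alpha.

    The penalty partially pools the input effects toward zero, a crude
    stand-in for the hierarchical prior in the paper.
    dens: (n, K) predictive densities of each model at each data point.
    """
    K = dens.shape[1]
    mu, alpha = params[: K - 1], params[K - 1 :]
    w = input_varying_weights(X, mu, alpha)
    mix = (w * dens).sum(axis=1)  # pointwise mixture density
    return -np.log(mix).sum() + (alpha ** 2).sum() / (2 * sigma ** 2)

# Synthetic demo: model 0 predicts well for x < 0, model 1 for x > 0,
# so a single constant weight would be suboptimal everywhere.
X = np.linspace(-3, 3, 200)
dens = np.column_stack([np.where(X < 0, 0.9, 0.1),
                        np.where(X < 0, 0.1, 0.9)])

res = minimize(objective, np.zeros(2), args=(X, dens))
mu_hat, alpha_hat = res.x[:1], res.x[1:]

# Learned weights at two test inputs: model 0 should dominate at x = -2,
# model 1 at x = +2.
w = input_varying_weights(np.array([-2.0, 2.0]), mu_hat, alpha_hat)
```

With homogeneous model performance the penalty pulls alpha toward zero and the method reduces to ordinary stacking with constant weights; the gain appears exactly when, as the title says, each model is useful somewhere.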
