
Scalable Cross Validation Losses for Gaussian Process Models

2021-05-24

Martin Jankowiak, Geoff Pleiss


Abstract

We introduce a simple and scalable method for training Gaussian process (GP) models that exploits cross-validation and nearest-neighbor truncation. To accommodate binary and multi-class classification, we leverage Pólya-Gamma auxiliary variables and variational inference. In an extensive empirical comparison with a number of alternative methods for scalable GP regression and classification, we find that our method offers fast training and excellent predictive performance. We argue that the good predictive performance can be traced to the non-parametric nature of the resulting predictive distributions as well as to the cross-validation loss, which provides robustness against model mis-specification.
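To make the core idea concrete, the following is a minimal sketch of a leave-one-out cross-validation loss for GP regression with nearest-neighbor truncation, in the spirit of the abstract's description: each training point is predicted from only its k nearest neighbors, so the per-point cost is cubic in k rather than in the full dataset size. The kernel choice, hyperparameters, and all function names here are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between two sets of points.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def nn_loo_cv_loss(X, y, k=8, noise=0.1, lengthscale=1.0, variance=1.0):
    """Negative mean leave-one-out predictive log-likelihood, where each
    point is predicted from its k nearest neighbors (Euclidean distance)
    instead of the full training set: O(N k^3) rather than O(N^3).
    Hypothetical sketch; hyperparameters would normally be optimized by
    minimizing this loss."""
    N = X.shape[0]
    total = 0.0
    for i in range(N):
        # Find the k nearest neighbors of x_i among the other points.
        d2 = ((X - X[i]) ** 2).sum(-1)
        d2[i] = np.inf  # exclude the held-out point itself
        nbrs = np.argsort(d2)[:k]
        Xn, yn = X[nbrs], y[nbrs]
        # Exact GP posterior conditioned on the truncated neighbor set.
        Knn = rbf_kernel(Xn, Xn, lengthscale, variance) + noise * np.eye(k)
        kxn = rbf_kernel(X[i:i+1], Xn, lengthscale, variance)[0]
        mu = kxn @ np.linalg.solve(Knn, yn)
        var = variance + noise - kxn @ np.linalg.solve(Knn, kxn)
        # Gaussian predictive log-density of the held-out observation.
        total += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return -total / N
```

Because each held-out point conditions only on a small neighbor set, the loss (and its gradient, under an autodiff framework) scales linearly in the number of data points, which is what makes the cross-validation objective tractable at scale.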
