The Gaussian Neural Process
2021-01-10 · AABI Symposium 2021 (Advances in Approximate Bayesian Inference)
Wessel P. Bruinsma, James Requeima, Andrew Y. K. Foong, Jonathan Gordon, Richard E. Turner
- github.com/wesselb/NeuralProcesses.jl (official) ★ 76
Abstract
Neural Processes (NPs; Garnelo et al., 2018a,b) are a rich class of models for meta-learning that map data sets directly to predictive stochastic processes. We provide a rigorous analysis of the standard maximum-likelihood objective used to train conditional NPs. Moreover, we propose a new member of the Neural Process family called the Gaussian Neural Process (GNP), which models predictive correlations, incorporates translation equivariance, provides universal approximation guarantees, and demonstrates encouraging performance.
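To make the abstract's key distinction concrete, here is a minimal sketch of what "mapping a context set to a predictive stochastic process with correlations" means. This is not the authors' architecture: it replaces the GNP's learned, translation-equivariant networks with a fixed RBF kernel and exact Gaussian conditioning, purely to illustrate that the output is a joint Gaussian over target points (full covariance), rather than the independent per-point marginals of a conditional NP. The function and parameter names are hypothetical.

```python
import numpy as np

def rbf(x1, x2, lengthscale=0.5):
    # Squared-exponential kernel; a stand-in for a learned covariance map.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gaussian_predictive(xc, yc, xt, noise=0.05):
    """Map a context set (xc, yc) to a joint Gaussian over targets xt.

    Illustrative only: uses exact GP conditioning with a fixed kernel in
    place of the GNP's learned, translation-equivariant components.
    """
    Kcc = rbf(xc, xc) + noise * np.eye(len(xc))
    Ktc = rbf(xt, xc)
    Ktt = rbf(xt, xt)
    A = np.linalg.solve(Kcc, Ktc.T)   # (n_context, n_target)
    mean = A.T @ yc                   # predictive mean at targets
    cov = Ktt - Ktc @ A               # full predictive covariance
    return mean, cov

xc = np.array([-1.0, 0.0, 1.0])
yc = np.sin(xc)
xt = np.linspace(-2.0, 2.0, 5)
mean, cov = gaussian_predictive(xc, yc, xt)

# Because cov has off-diagonal structure, joint samples are coherent
# functions, unlike independent draws from per-point marginals.
rng = np.random.default_rng(0)
sample = rng.multivariate_normal(mean, cov + 1e-8 * np.eye(len(xt)))
```

The predictive correlations captured by `cov` are what enable coherent function samples and well-calibrated joint predictions, which the abstract identifies as the GNP's contribution over conditional NPs.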