Dual Parameterization of Sparse Variational Gaussian Processes
2021-11-05 · NeurIPS 2021
Vincent Adam, Paul E. Chang, Mohammad Emtiyaz Khan, Arno Solin
- Code: github.com/AaltoML/t-SVGP (official, referenced in paper, TensorFlow, ★ 9)
Abstract
Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency by using a dual parameterization where each data example is assigned dual parameters, similar to the site parameters used in expectation propagation. Our dual parameterization speeds up inference using natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
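To make the "dual parameters per data example" idea concrete, here is a minimal NumPy sketch for the simplest (conjugate, Gaussian-likelihood) case, where the analogy to expectation-propagation site parameters is exact. All names and the kernel choice are illustrative assumptions, not taken from the paper: each data point i carries a pair (lam1_i, lam2_i), and the Gaussian posterior is recovered from the prior plus these per-point sites.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 5, 0.1                      # toy data size and noise variance (assumed)
X = rng.normal(size=(n, 1))
y = rng.normal(size=n)

def rbf(A, B):
    """Squared-exponential kernel (illustrative choice)."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2)

K = rbf(X, X) + 1e-9 * np.eye(n)        # prior covariance with jitter

# Dual (site) parameterization: one pair of parameters per data example.
# For a Gaussian likelihood the sites are available in closed form.
lam1 = y / sigma2                       # first dual parameter per point
lam2 = np.full(n, 1.0 / sigma2)         # second dual parameter per point

# Posterior assembled from prior + per-point sites:
#   Sigma = (K^{-1} + diag(lam2))^{-1},  mu = Sigma @ lam1
Sigma = np.linalg.inv(np.linalg.inv(K) + np.diag(lam2))
mu = Sigma @ lam1

# Sanity check against the standard GP-regression posterior.
A = np.linalg.inv(K + sigma2 * np.eye(n))
mu_std = K @ A @ y
Sigma_std = K - K @ A @ K
assert np.allclose(mu, mu_std) and np.allclose(Sigma, Sigma_std)
```

In the non-conjugate setting the paper targets, the sites are no longer available in closed form; the dual parameterization instead updates them with natural-gradient steps while keeping the same O(n) per-point storage shown here.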