The Rényi Gaussian Process: Towards Improved Generalization
2019-10-15
Xubo Yue, Raed Kontar
Abstract
We introduce an alternative closed-form lower bound on the Gaussian process (GP) likelihood based on the Rényi α-divergence. This new lower bound can be viewed as a convex combination of the Nyström approximation and the exact GP. The key advantage of this bound is its ability to control and tune the regularization enforced on the model; it thus generalizes traditional variational GP regression. From a theoretical perspective, we provide the convergence rate and risk bound for inference with the proposed approach. Experiments on real data show that the proposed algorithm can deliver improvements over several GP inference methods.
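The central idea — an objective that interpolates between the Nyström approximation and the exact GP likelihood — can be sketched numerically. The sketch below is illustrative only: the weight `lam` and its link to the Rényi parameter, the RBF kernel, and the choice of inducing points `Z` are assumptions for the example, not the paper's exact bound.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal(y, K, noise=0.1):
    # Exact GP log marginal likelihood: log N(y | 0, K + noise * I).
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))

def interpolated_bound(y, X, Z, lam, noise=0.1):
    # Convex combination of the exact GP objective and the
    # Nystrom-approximated objective, weighted by lam in [0, 1].
    # (A hypothetical stand-in for the paper's Renyi bound: lam = 1
    #  recovers the exact GP, lam = 0 the pure Nystrom objective.)
    K = rbf_kernel(X, X)
    Kzz = rbf_kernel(Z, Z) + 1e-8 * np.eye(len(Z))  # jitter for stability
    Kxz = rbf_kernel(X, Z)
    K_nystrom = Kxz @ np.linalg.solve(Kzz, Kxz.T)
    return (lam * gp_log_marginal(y, K, noise)
            + (1 - lam) * gp_log_marginal(y, K_nystrom, noise))
```

Varying `lam` between 0 and 1 tunes how strongly the low-rank approximation regularizes the model, which mirrors the abstract's claim that the bound interpolates between the Nyström approximation and the exact GP.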