The Renyi Gaussian Process: Towards Improved Generalization

Abstract

We introduce an alternative closed-form lower bound on the Gaussian process (GP) likelihood based on the Rényi α-divergence. This new lower bound can be viewed as a convex combination of the Nyström approximation and the exact GP. The key advantage of this bound is its ability to control and tune the regularization enforced on the model; it thus generalizes traditional variational GP regression. From a theoretical perspective, we provide the convergence rate and a risk bound for inference with our proposed approach. Experiments on real data show that the proposed algorithm may deliver improvements over several GP inference methods.
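To make the "convex combination" description concrete, below is a minimal illustrative sketch in NumPy: it simply interpolates between the exact GP log marginal likelihood and a Nyström-based counterpart with a mixing weight. This is not the paper's derived bound; the RBF kernel, the inducing inputs Z, the noise level, and the alpha_mix parameter are all assumptions introduced here for illustration.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel between two sets of inputs.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def exact_gp_log_marginal(X, y, noise=0.1):
    # Exact GP log marginal likelihood: log N(y | 0, K + noise^2 I).
    n = X.shape[0]
    K = rbf_kernel(X, X) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

def nystrom_gp_log_marginal(X, y, Z, noise=0.1):
    # Nyström approximation: replace K with Knm Kmm^{-1} Kmn.
    n, m = X.shape[0], Z.shape[0]
    Kmm = rbf_kernel(Z, Z) + 1e-6 * np.eye(m)  # jitter for stability
    Knm = rbf_kernel(X, Z)
    Q = Knm @ np.linalg.solve(Kmm, Knm.T)
    L = np.linalg.cholesky(Q + noise**2 * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

def mixed_objective(X, y, Z, alpha_mix=0.5, noise=0.1):
    # Hypothetical objective interpolating between the exact GP likelihood
    # (alpha_mix = 1) and the Nyström approximation (alpha_mix = 0).
    return (alpha_mix * exact_gp_log_marginal(X, y, noise)
            + (1.0 - alpha_mix) * nystrom_gp_log_marginal(X, y, Z, noise))

# Toy usage: 1-D regression with a handful of inducing inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Z = X[::10]  # inducing inputs taken from the training set
print(mixed_objective(X, y, Z, alpha_mix=0.3))

In this sketch the weight alpha_mix plays the role of the tunable regularization knob described in the abstract: moving it toward the Nyström side enforces a cheaper, more heavily regularized approximation, while moving it toward the exact side recovers standard GP inference.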
