Kernel Interpolation for Scalable Online Gaussian Processes

Abstract

Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black-box optimization, where we need to update a posterior distribution as we acquire data in a sequential fashion. However, updating a GP posterior to accommodate even a single new observation after having observed n points incurs at least O(n) computations in the exact setting. We show how to use structured kernel interpolation to efficiently recycle computations for constant-time O(1) online updates with respect to the number of points n, while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, Bayesian optimization, and active sampling to reduce error in malaria incidence forecasting. Code is available at https://github.com/wjmaddox/online_gp.
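To illustrate the core idea, the following is a minimal, hypothetical 1-D sketch (not the authors' implementation, which is in the linked repository) of how structured kernel interpolation enables online updates whose cost does not grow with n. It assumes an RBF kernel, a fixed inducing grid, and simple linear interpolation weights; the names SKIOnlineGP1D, update, and predict_mean are invented for this illustration. The point it demonstrates: because the kernel is approximated as k(x, z) ≈ w(x)^T K_UU w(z) with sparse interpolation weights w, it suffices to maintain the grid-space statistics S = W^T W and b = W^T y, each of which absorbs a new observation at a cost independent of how many points have already been seen.

```python
import numpy as np


class SKIOnlineGP1D:
    """Hypothetical sketch of SKI-style online GP regression on a 1-D grid."""

    def __init__(self, grid, lengthscale=0.5, noise=0.1):
        self.u = np.asarray(grid, dtype=float)            # inducing grid U (m points)
        self.h = self.u[1] - self.u[0]                    # grid spacing (assumed uniform)
        m = len(self.u)
        d = self.u[:, None] - self.u[None, :]
        self.Kuu = np.exp(-0.5 * (d / lengthscale) ** 2)  # RBF kernel on the grid
        self.noise = noise
        self.S = np.zeros((m, m))                         # running W^T W
        self.b = np.zeros(m)                              # running W^T y

    def _weights(self, x):
        """Sparse linear-interpolation weights of x onto the grid (two nonzeros)."""
        i = int(np.clip((x - self.u[0]) // self.h, 0, len(self.u) - 2))
        t = (x - self.u[i]) / self.h
        return np.array([i, i + 1]), np.array([1.0 - t, t])

    def update(self, x, y):
        """Absorb one observation; cost is independent of the number of past points."""
        idx, w = self._weights(x)
        self.S[np.ix_(idx, idx)] += np.outer(w, w)        # sparse rank-1 update to W^T W
        self.b[idx] += y * w                              # sparse update to W^T y

    def predict_mean(self, x):
        """Approximate posterior mean: w_*^T K_UU (S K_UU + sigma^2 I)^{-1} b."""
        idx, w = self._weights(x)
        A = self.S @ self.Kuu + self.noise ** 2 * np.eye(len(self.u))
        alpha = np.linalg.solve(A, self.b)
        return (self.Kuu[idx] @ alpha) @ w


# Usage: stream noisy samples of sin(x) and query the running posterior mean.
gp = SKIOnlineGP1D(grid=np.linspace(0, 2 * np.pi, 50))
rng = np.random.default_rng(0)
for x in rng.uniform(0, 2 * np.pi, size=200):
    gp.update(x, np.sin(x) + 0.1 * rng.standard_normal())
print(gp.predict_mean(np.pi / 2))  # expected to be close to sin(pi/2) = 1.0
```

In this sketch the per-point update touches only the handful of grid entries selected by the interpolation weights, while prediction solves an m x m system in grid space; both costs depend on the grid size m but not on n, which is the constant-time behavior the abstract refers to.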
