ResearchTrend.AI

Kernel Interpolation for Scalable Online Gaussian Processes

International Conference on Artificial Intelligence and Statistics (AISTATS), 2021
2 March 2021
Samuel Stanton
Wesley J. Maddox
Ian A. Delbridge
A. Wilson
Topic: GP
Links: arXiv (abs) · PDF · HTML · GitHub (62★)
Abstract

Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black-box optimization, where we need to update a posterior distribution as we acquire data in a sequential fashion. However, updating a GP posterior to accommodate even a single new observation after having observed n points incurs at least O(n) computations in the exact setting. We show how to use structured kernel interpolation to efficiently recycle computations for constant-time O(1) online updates with respect to the number of points n, while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, Bayesian optimization, and active sampling to reduce error in malaria incidence forecasting. Code is available at https://github.com/wjmaddox/online_gp.
