Variational Orthogonal Features

23 June 2020 · arXiv: 2006.13170

David R. Burt, C. Rasmussen, Mark van der Wilk
Abstract

Sparse stochastic variational inference allows Gaussian process models to be applied to large datasets. The per-iteration computational cost of inference with this method is $\mathcal{O}(\tilde{N}M^2 + M^3)$, where $\tilde{N}$ is the number of points in a minibatch and $M$ is the number of `inducing features', which determine the expressiveness of the variational family. Several recent works have shown that for certain priors, features can be defined that remove the $\mathcal{O}(M^3)$ cost of computing a minibatch estimate of an evidence lower bound (ELBO). This represents a significant computational savings when $M \gg \tilde{N}$. We present a construction of features for any stationary prior kernel that allow for computation of an unbiased estimator to the ELBO using $T$ Monte Carlo samples in $\mathcal{O}(\tilde{N}T + M^2T)$, and in $\mathcal{O}(\tilde{N}T + MT)$ with an additional approximation. We analyze the impact of this additional approximation on inference quality.
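To make the complexity claim concrete, the toy NumPy sketch below (not the authors' code; the RBF kernel, inducing inputs `Z`, feature matrix `Phi`, and all variational parameters are placeholder assumptions) contrasts the $\mathcal{O}(M^3)$ Cholesky factorization that dominates a standard sparse variational ELBO iteration with a Monte Carlo estimate in which the inducing features are orthogonal, i.e. $K_{uu} = I$, so the cubic term disappears.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem sizes: minibatch, inducing features, Monte Carlo samples.
N_tilde, M, T = 256, 512, 10

# ---- Standard sparse variational GP term ---------------------------------
# K_uu is the M x M prior covariance of the inducing features. Its Cholesky
# factor is recomputed every iteration and costs O(M^3) -- the term that
# orthogonal feature constructions remove.
Z = rng.standard_normal((M, 1))                        # toy 1-D inducing inputs
K_uu = np.exp(-0.5 * (Z - Z.T) ** 2) + 1e-6 * np.eye(M)
L = np.linalg.cholesky(K_uu)                           # O(M^3)

X = rng.standard_normal((N_tilde, 1))                  # minibatch inputs
K_uf = np.exp(-0.5 * (Z - X.T) ** 2)                   # M x N_tilde cross-cov.
A = np.linalg.solve(L, K_uf)                           # O(N_tilde * M^2)
# A feeds the closed-form Gaussian ELBO terms: O(N_tilde M^2 + M^3) total.

# ---- Orthogonal features, Monte Carlo ELBO (schematic) -------------------
# If the features are built so that K_uu = I, the Cholesky disappears and an
# unbiased ELBO estimate can instead be formed from T samples of the
# variational posterior over the inducing variables.
m = rng.standard_normal(M)                             # variational mean
S_sqrt = np.tril(rng.standard_normal((M, M))) * 0.01   # variational sqrt-cov.
eps = rng.standard_normal((T, M))
u = m + eps @ S_sqrt.T                                 # T samples, O(M^2 T)
Phi = np.exp(-0.5 * (X - Z.T) ** 2)                    # stand-in feature matrix
f = u @ Phi.T                                          # T x N_tilde function draws
# f feeds a Monte Carlo estimate of the expected log-likelihood; no O(M^3)
# factorization is needed anywhere in the iteration.
```

Note that the dense feature product in this toy version costs $\mathcal{O}(\tilde{N}MT)$; reaching the paper's $\mathcal{O}(\tilde{N}T + M^2T)$ and $\mathcal{O}(\tilde{N}T + MT)$ rates relies on the specific construction for stationary kernels, which this sketch does not reproduce.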
