Fully Scalable Gaussian Processes using Subspace Inducing Inputs (arXiv:1807.02537)
6 July 2018
A. Panos, P. Dellaportas, Michalis K. Titsias

Papers citing "Fully Scalable Gaussian Processes using Subspace Inducing Inputs" (4 papers shown):

  1. Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees
     Alexander Terenin, David R. Burt, A. Artemev, Seth Flaxman, Mark van der Wilk, C. Rasmussen, Hong Ge
     14 Oct 2022
  2. Conditioning Sparse Variational Gaussian Processes for Online Decision-making
     Wesley J. Maddox, Samuel Stanton, A. Wilson
     28 Oct 2021
  3. Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes
     Sebastian W. Ober, Laurence Aitchison
     17 May 2020
  4. Weakly-supervised Multi-output Regression via Correlated Gaussian Processes
     Seokhyun Chung, Raed Al Kontar, Zhenke Wu
     19 Feb 2020