
Large Linear Multi-output Gaussian Process Learning for Time Series

30 May 2017
Vladimir Feinberg
Li-Fang Cheng
Kai Li
Barbara E. Engelhardt
Abstract

Gaussian processes, or distributions over arbitrary functions in a continuous domain, can be generalized to the multi-output case; the linear model of coregionalization (LMC) is one such approach. LMCs estimate and exploit correlations across the multiple outputs. While model estimation can be performed efficiently for single-output GPs, those efficient methods assume stationarity; in the multi-output case, however, the cross-covariance interaction is not stationary. We propose Large Linear GPs (LLGPs), which circumvent the need for stationarity by exploiting the LMC's structure, enabling optimization of GP hyperparameters for multi-dimensional outputs and one-dimensional inputs. On real time series data, we find that the theoretical improvement over the current state of the art is realized in practice: LLGP is generally an order of magnitude faster while maintaining or improving predictive accuracy.
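
To make the covariance structure the abstract refers to concrete, here is a minimal sketch (not the authors' LLGP implementation) of an LMC cross-covariance for D outputs and one-dimensional inputs: K(x, x')[i, j] = sum_q B_q[i, j] * k_q(x, x'), where each coregionalization matrix B_q = a_q a_q^T + diag(kappa_q) mixes Q shared stationary kernels. The function names and the toy parameter values below are illustrative assumptions, not quantities from the paper.

    import numpy as np

    def rbf(x, xp, lengthscale):
        """Stationary RBF kernel on 1-D inputs; plays the role of a latent k_q."""
        d = x[:, None] - xp[None, :]
        return np.exp(-0.5 * (d / lengthscale) ** 2)

    def lmc_covariance(x, xp, A, kappa, lengthscales):
        """Full (D*n) x (D*m) LMC covariance over D outputs.

        A            -- (Q, D) mixing vectors a_q (illustrative values below)
        kappa        -- (Q, D) per-output diagonal terms
        lengthscales -- (Q,)   lengthscale of each latent kernel k_q
        """
        Q, D = A.shape
        n, m = len(x), len(xp)
        K = np.zeros((D * n, D * m))
        for q in range(Q):
            B_q = np.outer(A[q], A[q]) + np.diag(kappa[q])  # coregionalization matrix
            K_q = rbf(x, xp, lengthscales[q])                # shared latent kernel
            K += np.kron(B_q, K_q)                           # cross-output covariance block
        return K

    # Toy usage: D = 2 correlated outputs, Q = 2 latent processes, n = 50 inputs.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 50)
    A = rng.normal(size=(2, 2))
    kappa = np.full((2, 2), 0.1)
    K = lmc_covariance(x, x, A, kappa, lengthscales=np.array([1.0, 3.0]))
    # Draw one correlated multi-output sample from the prior (jitter for stability).
    y = rng.multivariate_normal(np.zeros(K.shape[0]), K + 1e-6 * np.eye(K.shape[0]))

Note how each summand in the LMC covariance is a Kronecker product of a coregionalization matrix with a stationary kernel matrix; the full cross-covariance over all outputs is therefore not stationary even though each latent k_q is, which is the structural point the abstract makes.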

View on arXiv: https://arxiv.org/abs/1705.10813