Nyström Regularization for Time Series Forecasting

13 November 2021
Zirui Sun
Mingwei Dai
Yao Wang
Shao-Bo Lin
    AI4TS
arXiv:2111.07109
Abstract

This paper focuses on the learning rate analysis of Nyström regularization with sequential sub-sampling for τ-mixing time series. Using a recently developed Banach-valued Bernstein inequality for τ-mixing sequences and an integral operator approach based on a second-order decomposition, we derive almost optimal learning rates for Nyström regularization with sequential sub-sampling on τ-mixing time series. A series of numerical experiments verifies our theoretical results and shows the excellent learning performance of Nyström regularization with sequential sub-sampling when learning from massive time series data. These results extend the applicable range of Nyström regularization from i.i.d. samples to non-i.i.d. sequences.
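
To make the method concrete, below is a minimal sketch of Nyström-regularized kernel ridge regression with sequential sub-sampling, where the landmark points are taken as a contiguous initial block of the series rather than an i.i.d. random draw. The Gaussian kernel, the synthetic AR(1) data, and all hyperparameters (m, lam, sigma) are illustrative assumptions for this sketch, not the authors' code or experimental setup.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise squared distances between rows of A (n x d) and B (m x d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def nystrom_krr_fit(X, y, m, lam, sigma=1.0):
    # Sequential sub-sampling: landmarks are the first m samples of the
    # series (a contiguous block), rather than uniformly random points.
    Z = X[:m]
    Knm = gaussian_kernel(X, Z, sigma)   # n x m cross-kernel matrix
    Kmm = gaussian_kernel(Z, Z, sigma)   # m x m landmark kernel matrix
    n = X.shape[0]
    # Standard Nystrom kernel ridge regression coefficients:
    #   alpha = (Knm^T Knm + n * lam * Kmm)^{-1} Knm^T y
    A = Knm.T @ Knm + n * lam * Kmm
    alpha = np.linalg.solve(A + 1e-10 * np.eye(m), Knm.T @ y)
    return Z, alpha

def nystrom_krr_predict(Xnew, Z, alpha, sigma=1.0):
    return gaussian_kernel(Xnew, Z, sigma) @ alpha

# Toy usage: one-step-ahead forecasting of a synthetic AR(1) series
# (hypothetical data, not the paper's experiments).
rng = np.random.default_rng(0)
T = 2000
s = np.zeros(T)
for t in range(1, T):
    s[t] = 0.8 * s[t - 1] + 0.1 * rng.standard_normal()
X, y = s[:-1, None], s[1:]               # lagged value -> next value
Z, alpha = nystrom_krr_fit(X, y, m=100, lam=1e-3)
mse = np.mean((nystrom_krr_predict(X, Z, alpha) - y) ** 2)
print(f"in-sample MSE: {mse:.4f}")
```

The regularized system above is the usual Nyström kernel ridge regression estimator; the only time-series-specific ingredient shown is that the landmark set is a sequential block of the data, which is the sub-sampling scheme whose learning rates the paper analyzes under τ-mixing.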
