Lasso Guarantees for Time Series Estimation Under Subgaussian Tails and β-Mixing

12 February 2016
Kam Chung Wong
Zifan Li
Ambuj Tewari
Abstract

Many theoretical results on estimation of high-dimensional time series require specifying an underlying data generating model (DGM). Instead, this paper relies only on (strict) stationarity and a β-mixing condition to establish consistency of the Lasso when the data come from a β-mixing process with marginals having subgaussian tails. We establish non-asymptotic inequalities for the estimation and prediction errors of the Lasso estimate of the best linear predictor in dependent data. Applications of these results potentially extend to non-Gaussian, non-Markovian and non-linear time series models, as the examples we provide demonstrate. In order to prove our results, we derive a novel Hanson-Wright type concentration inequality for β-mixing subgaussian random vectors that may be of independent interest.
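
The abstract's central object is the Lasso estimate of the best linear predictor computed from dependent data. The sketch below is a minimal illustration of that setup, not the authors' procedure or constants: it simulates a sparse, stable VAR(1) process (a standard example of a geometrically β-mixing time series under common assumptions) and fits a Lasso to a lag-1 design. The penalty level, the scikit-learn estimator, and the process parameters are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a stationary VAR(1) process x_t = A x_{t-1} + eps_t with a sparse,
# stable transition matrix A; such processes are geometrically beta-mixing
# under standard conditions, which matches the paper's setting.
p, T = 20, 500
A = np.zeros((p, p))
A[np.arange(p), np.arange(p)] = 0.5          # sparse diagonal dependence
X = np.zeros((T + 1, p))
for t in range(1, T + 1):
    X[t] = A @ X[t - 1] + rng.normal(size=p)

# Lasso estimate of the best linear predictor of one coordinate from the
# previous time step: regress x_{t,0} on the full lagged vector x_{t-1}.
Y = X[1:, 0]                     # responses at time t
Z = X[:-1, :]                    # lag-1 predictors
lam = np.sqrt(np.log(p) / T)     # illustrative rate-style penalty, not the paper's constant
fit = Lasso(alpha=lam, fit_intercept=False).fit(Z, Y)

print("selected coordinates:", np.flatnonzero(fit.coef_))
print("error vs. true row of A:", np.linalg.norm(fit.coef_ - A[0]))
```

In this toy example the true predictor is the first row of A, so the printed error gives a rough sense of the estimation accuracy that the paper's non-asymptotic bounds control in the dependent, subgaussian regime.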
