arXiv:2410.05690
Long-Context Linear System Identification

8 October 2024
Oğuz Kaan Yüksel
Mathieu Even
Nicolas Flammarion
Abstract

This paper addresses the problem of long-context linear system identification, where the state x_t of a dynamical system at time t depends linearly on previous states x_s over a fixed context window of length p. We establish a sample complexity bound that matches the i.i.d. parametric rate up to logarithmic factors for a broad class of systems, extending previous works that considered only first-order dependencies. Our findings reveal a learning-without-mixing phenomenon, indicating that learning long-context linear autoregressive models is not hindered by slow mixing properties potentially associated with extended context windows. Additionally, we extend these results to (i) shared low-rank representations, where rank-regularized estimators improve rates with respect to dimensionality, and (ii) misspecified context lengths in strictly stable systems, where shorter contexts offer statistical advantages.
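The setting described above can be sketched in a few lines of numpy: simulate a strictly stable linear system whose state depends on its previous p states, then recover the context matrices by ordinary least squares on the stacked context. This is a minimal illustrative sketch, not the paper's analysis; the dimensions, noise level, and stability scaling are assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy dimensions (not taken from the paper).
d, p, T = 3, 4, 2000  # state dimension, context length, trajectory length

# Random context matrices A_1, ..., A_p, scaled small enough that the
# long-context system is strictly stable for this toy example.
A = [0.2 / p * rng.normal(size=(d, d)) for _ in range(p)]

# Simulate x_t = sum_{s=1}^{p} A_s x_{t-s} + noise.
x = np.zeros((T, d))
for t in range(p, T):
    x[t] = sum(A[s] @ x[t - 1 - s] for s in range(p)) + 0.1 * rng.normal(size=d)

# Least squares of x_t on the stacked context (x_{t-1}, ..., x_{t-p}).
Z = np.stack([np.concatenate([x[t - 1 - s] for s in range(p)])
              for t in range(p, T)])
Y = x[p:]
Theta, *_ = np.linalg.lstsq(Z, Y, rcond=None)  # shape (p*d, d)
A_hat = [Theta[s * d:(s + 1) * d].T for s in range(p)]

err = max(np.linalg.norm(A_hat[s] - A[s]) for s in range(p))
print(f"worst-case estimation error: {err:.3f}")
```

With a single trajectory the regression samples are dependent, which is exactly where the paper's learning-without-mixing result applies: the least-squares error still shrinks at the parametric rate up to logarithmic factors, despite the correlations induced by the long context.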
