A Unifying Framework for Parallelizing Sequential Models with Linear Dynamical Systems

26 September 2025
Xavier Gonzalez
E. Kelly Buchanan
Hyun Dong Lee
Jerry W. Liu
Ke Alexander Wang
D. Zoltowski
Christopher Ré
Scott W. Linderman
arXiv:2509.21716 · abs · PDF · HTML · GitHub (1★)
Main: 12 pages · Appendix: 10 pages · Bibliography: 5 pages · 8 figures · 4 tables
Abstract

Harnessing parallelism in seemingly sequential models is a central challenge for modern machine learning. Several approaches have been proposed for evaluating sequential processes in parallel using fixed-point methods, like Newton, Picard, and Jacobi iterations. In this work, we show that these methods can be understood within a common framework based on linear dynamical systems (LDSs), where different iteration schemes arise naturally as approximate linearizations of a nonlinear recursion. This unifying view highlights shared principles behind these techniques and clarifies when particular fixed-point methods are most likely to be effective. By bridging diverse algorithms through the language of LDSs, our framework provides a clearer theoretical foundation for parallelizing sequential models and points toward new opportunities for efficient and scalable computation.
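The core idea described above can be sketched with the simplest of the named schemes, a Jacobi-style fixed-point iteration: treat the whole trajectory of a nonlinear recursion x_t = f(x_{t-1}) as the unknown, and repeatedly apply f to every time step at once until the trajectory stops changing. This is a minimal illustrative sketch, not the paper's implementation; the names `f`, `sequential_scan`, and `jacobi_parallel_scan` are assumptions made for this example.

```python
import numpy as np

def f(x):
    # Illustrative nonlinear state transition (hypothetical example choice).
    return 0.5 * np.tanh(x) + 0.1

def sequential_scan(x0, T):
    # Ground truth: evaluate the recursion one step at a time.
    xs = [x0]
    for _ in range(T):
        xs.append(f(xs[-1]))
    return np.array(xs[1:])

def jacobi_parallel_scan(x0, T, n_iters=50):
    # Jacobi fixed-point iteration: start from a guess for the entire
    # trajectory, then update *all* time steps simultaneously using the
    # previous iterate. The true sequential trajectory is a fixed point
    # of this map; for a causal recursion, the iteration is exact after
    # at most T sweeps, since correct values propagate one step per sweep.
    xs = np.zeros(T)
    for _ in range(n_iters):
        prev = np.concatenate(([x0], xs[:-1]))  # shift: x_{t-1} for each t
        xs = f(prev)  # every time step updated in parallel
    return xs

x0, T = 0.0, 20
assert np.allclose(sequential_scan(x0, T), jacobi_parallel_scan(x0, T))
```

Newton-style variants (as the framework in this paper makes precise) instead linearize f around the current trajectory iterate, yielding a linear dynamical system that can itself be solved with a parallel scan, which typically converges in far fewer sweeps than the Jacobi update shown here.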
