Generalization Error Analysis for Selective State-Space Models Through the Lens of Attention

Abstract

State-space models (SSMs) are a new class of foundation models that have emerged as a compelling alternative to Transformers and their attention mechanisms for sequence processing tasks. This paper provides a detailed theoretical analysis of selective SSMs, the core components of the Mamba and Mamba-2 architectures. We leverage the connection between selective SSMs and the self-attention mechanism to highlight the fundamental similarities between these models. Building on this connection, we establish a length-independent, covering-number-based generalization bound for selective SSMs, providing a deeper understanding of their theoretical performance guarantees. We analyze the effects of state matrix stability and input-dependent discretization, shedding light on the critical role these factors play in the generalization capabilities of selective SSMs. Finally, we empirically demonstrate the sequence-length independence of the derived bounds on two tasks.
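
For context, here is a minimal sketch of a selective SSM layer in the standard Mamba-style formulation; the symbols \Delta_t, A, B_t, C_t, h_t below are generic placeholders and are not taken from the paper's notation. The layer makes its state-space parameters functions of the input, so after input-dependent discretization the recurrence reads

  h_t = \bar{A}_t h_{t-1} + \bar{B}_t x_t, \qquad y_t = C_t^{\top} h_t,
  \qquad \bar{A}_t = \exp(\Delta_t A), \quad \bar{B}_t \approx \Delta_t B_t,

with the step size \Delta_t and the matrices B_t, C_t computed from the current input x_t. Unrolling the recurrence over the sequence gives

  y_t = \sum_{s=1}^{t} C_t^{\top} \Big( \prod_{r=s+1}^{t} \bar{A}_r \Big) \bar{B}_s x_s = \sum_{s=1}^{t} \alpha_{t,s} x_s,

i.e., an attention-like weighted sum over past inputs with data-dependent weights \alpha_{t,s}. This unrolled form is the connection to self-attention referenced in the abstract, and it also suggests why state matrix stability and input-dependent discretization matter: they govern how the products \prod_r \bar{A}_r, and hence the weights, scale with sequence length.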

@article{honarpisheh2025_2502.01473,
  title={Generalization Error Analysis for Selective State-Space Models Through the Lens of Attention},
  author={Arya Honarpisheh and Mustafa Bozdag and Mario Sznaier and Octavia Camps},
  journal={arXiv preprint arXiv:2502.01473},
  year={2025}
}