Efficient Unstructured Pruning of Mamba State-Space Models for Resource-Constrained Environments

Abstract

State-space models (SSMs), particularly the Mamba architecture, have emerged as powerful alternatives to Transformers for sequence modeling, offering linear-time complexity and competitive performance across diverse tasks. However, their large parameter counts pose significant challenges for deployment in resource-constrained environments. We propose a novel unstructured pruning framework tailored for Mamba models that achieves up to 70% parameter reduction while retaining over 95% of the original performance. Our approach integrates three key innovations: (1) a gradient-aware magnitude pruning technique that combines weight magnitude and gradient information to identify less critical parameters, (2) an iterative pruning schedule that gradually increases sparsity to maintain model stability, and (3) a global pruning strategy that optimizes parameter allocation across the entire model. Through extensive experiments on WikiText-103, Long Range Arena, and ETT time-series benchmarks, we demonstrate significant efficiency gains with minimal performance degradation. Our analysis of pruning effects on Mamba's components reveals critical insights into the architecture's redundancy and robustness, enabling practical deployment in resource-constrained settings while broadening Mamba's applicability.
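The abstract summarizes the method but includes no code; below is a minimal PyTorch sketch of how the three ingredients could fit together. The saliency score |w| * |dL/dw|, the cubic sparsity ramp, and the restriction of scoring to weight tensors of two or more dimensions are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


def saliency_scores(model: nn.Module) -> dict[str, torch.Tensor]:
    """Gradient-aware magnitude score |w| * |dL/dw| per weight.

    Assumes loss.backward() has already populated .grad; biases and
    1-D tensors (norms, etc.) are left unscored (an assumption).
    """
    scores = {}
    for name, p in model.named_parameters():
        if p.grad is not None and p.dim() >= 2:
            scores[name] = p.detach().abs() * p.grad.detach().abs()
    return scores


def global_prune(model: nn.Module, sparsity: float) -> dict[str, torch.Tensor]:
    """Global unstructured pruning: a single threshold across all scored
    weights, so sparsity is allocated wherever scores are lowest model-wide."""
    scores = saliency_scores(model)
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = int(sparsity * flat.numel())
    if k < 1:  # nothing to prune yet
        return {name: torch.ones_like(s) for name, s in scores.items()}
    threshold = torch.kthvalue(flat, k).values
    masks = {}
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in scores:
                mask = (scores[name] > threshold).to(p.dtype)
                p.mul_(mask)        # zero out the pruned weights in place
                masks[name] = mask  # keep the mask to re-apply after updates
    return masks


def sparsity_at_step(step: int, total_steps: int, final_sparsity: float) -> float:
    """Iterative schedule: ramp sparsity from 0 toward final_sparsity.

    The cubic shape follows Zhu & Gupta (2017) and stands in for whatever
    schedule the paper actually uses.
    """
    t = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - t) ** 3)

In a pruning-aware fine-tuning loop, one would backpropagate a batch loss, call global_prune(model, sparsity_at_step(step, total_steps, 0.70)) at regular intervals, and multiply each weight by its stored mask after every optimizer step so pruned entries stay at zero as sparsity ramps toward 70%.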

@article{shihab2025_2505.08299,
  title={Efficient Unstructured Pruning of Mamba State-Space Models for Resource-Constrained Environments},
  author={Ibne Farabi Shihab and Sanjeda Akter and Anuj Sharma},
  journal={arXiv preprint arXiv:2505.08299},
  year={2025}
}