
CodeSSM: Towards State Space Models for Code Understanding

Main: 9 pages · 7 figures · Bibliography: 4 pages · 12 tables · Appendix: 4 pages
Abstract

Although transformers dominate many code-specific tasks, they have significant limitations. This paper explores State Space Models (SSMs) as a promising alternative for code understanding tasks such as retrieval, classification, and clone detection. We introduce CodeSSM, the first SSM-based model trained on code corpora, to assess its effectiveness. Our results demonstrate that SSMs are more sample-efficient and can extrapolate to contexts longer than the pretraining length. Extensive experiments show that SSMs offer a viable alternative to transformers, addressing several of their limitations. Additionally, CodeSSM reduces memory usage by up to 64% compared to transformers at a context length of 2048, with even greater savings as context length grows.
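The memory claim follows from the structure of SSMs themselves. As a rough illustration (a minimal sketch of the generic linear state space recurrence, not the CodeSSM architecture; all names and shapes below are hypothetical), an SSM layer processes a sequence through a recurrence whose state has a fixed size, so per-token memory does not depend on how many tokens came before:

```python
import torch

def ssm_scan(x, A, B, C):
    # Generic discrete linear SSM (illustrative only, not CodeSSM's layer):
    #   h_t = A h_{t-1} + B x_t,   y_t = C h_t
    # x: (seq_len, d_in); A: (d_state, d_state); B: (d_state, d_in); C: (d_out, d_state)
    h = torch.zeros(A.shape[0])
    ys = []
    for x_t in x:
        h = A @ h + B @ x_t   # state update: carries only a fixed-size vector
        ys.append(C @ h)      # readout from the current state
    return torch.stack(ys)    # (seq_len, d_out)

# Toy usage with arbitrary shapes and a stable transition matrix
seq_len, d_in, d_state, d_out = 2048, 16, 64, 16
x = torch.randn(seq_len, d_in)
A = 0.99 * torch.eye(d_state)       # toy choice keeping the recurrence stable
B = 0.1 * torch.randn(d_state, d_in)
C = 0.1 * torch.randn(d_out, d_state)
y = ssm_scan(x, A, B, C)            # shape: (2048, 16)
```

Because only the fixed-size state h is carried between steps, memory grows linearly with context length, whereas self-attention must score all pairs of tokens; this contrast underlies the savings reported above.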
