
Semantic-Guided Dynamic Sparsification for Pre-Trained Model-based Class-Incremental Learning

Ruiqi Liu
Boyu Diao
Zijia An
Runjie Shao
Zhulin An
Fei Wang
Yongjun Xu
Main: 8 pages · 9 figures · 3 tables · Bibliography: 3 pages · Appendix: 5 pages
Abstract

Class-Incremental Learning (CIL) requires a model to continually learn new classes without forgetting old ones. A common and efficient solution freezes a pre-trained model and trains lightweight adapters, whose parameters are often forced to be mutually orthogonal to prevent inter-task interference. However, we argue that such rigid parameter-space constraints are detrimental to plasticity. To this end, we propose Semantic-Guided Dynamic Sparsification (SGDS), a novel method that proactively shapes the activation space by governing the orientation and rank of its subspaces through targeted sparsification. Specifically, SGDS promotes knowledge transfer by encouraging semantically similar classes to share a compact activation subspace, while preventing interference by assigning non-overlapping activation subspaces to dissimilar classes. By sculpting class-specific sparse subspaces in the activation space, SGDS effectively mitigates interference without imposing rigid constraints on the parameter space. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of SGDS.
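The core idea above, that semantically similar classes share a compact activation subspace while dissimilar classes occupy disjoint ones, can be illustrated with a minimal sketch. This is not the authors' implementation: the greedy similarity grouping, the block-wise unit allocation, and all function names are illustrative assumptions made here for clarity.

```python
import numpy as np

def build_class_masks(class_embs, dim, sim_thresh=0.8, units_per_group=16):
    """Illustrative sketch (not the paper's algorithm): classes whose
    prototype embeddings are similar share one compact block of
    activation units; dissimilar classes receive disjoint blocks, so
    their activation subspaces do not overlap."""
    # Normalize prototypes so dot products are cosine similarities.
    normed = class_embs / np.linalg.norm(class_embs, axis=1, keepdims=True)
    groups = []  # each group is a list of class indices sharing a subspace
    for c in range(len(class_embs)):
        for g in groups:
            if normed[c] @ normed[g[0]] >= sim_thresh:
                g.append(c)  # similar to an existing group: share its units
                break
        else:
            groups.append([c])  # dissimilar to all groups: open a new one
    # Each group owns a disjoint contiguous slice of activation dimensions.
    masks = np.zeros((len(class_embs), dim), dtype=bool)
    for i, g in enumerate(groups):
        lo = i * units_per_group
        masks[g, lo:lo + units_per_group] = True
    return masks

def sparse_forward(h, mask):
    """Zero out activations outside a class's assigned sparse subspace."""
    return h * mask
```

With three class prototypes where the first two are near-duplicates, the first two masks coincide (shared subspace, enabling transfer) and the third is disjoint from both (non-overlapping subspace, preventing interference).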
