Generative Binary Memory: Pseudo-Replay Class-Incremental Learning on Binarized Embeddings

Abstract

In dynamic environments where new concepts continuously emerge, Deep Neural Networks (DNNs) must adapt by learning new classes while retaining previously acquired ones. This challenge is addressed by Class-Incremental Learning (CIL). This paper introduces Generative Binary Memory (GBM), a novel CIL pseudo-replay approach that generates synthetic binary pseudo-exemplars. Relying on Bernoulli Mixture Models (BMMs), GBM effectively models the multi-modal characteristics of class distributions in a latent binary space. With a specifically designed feature binarizer, our approach applies to any conventional DNN. GBM also natively supports Binary Neural Networks (BNNs) for highly constrained model sizes in embedded systems. Experimental results demonstrate that GBM achieves higher average accuracy than the state of the art on CIFAR100 (+2.9%) and TinyImageNet (+1.5%) for a ResNet-18 equipped with our binarizer. GBM also outperforms emerging CIL methods for BNNs on CORE50, with +3.1% final accuracy and a 4.7x memory reduction.
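The core generative component described above, a Bernoulli Mixture Model fitted to binarized embeddings and sampled to produce pseudo-exemplars, can be sketched as follows. This is a minimal illustration using standard EM for BMMs, not the paper's implementation; all function names, hyperparameters, and initialization choices here are assumptions.

```python
import numpy as np

def fit_bmm(X, n_components=2, n_iter=50, seed=0):
    """Fit a Bernoulli Mixture Model to binary embeddings X (n, d) via EM.

    Illustrative sketch: the paper's actual fitting procedure may differ.
    Returns mixing weights pi (k,) and Bernoulli means mu (k, d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_components, 1.0 / n_components)   # uniform init
    mu = rng.uniform(0.25, 0.75, size=(n_components, d))
    for _ in range(n_iter):
        # E-step: per-sample log-likelihood under each component
        log_p = (X @ np.log(mu).T) + ((1 - X) @ np.log(1 - mu).T) + np.log(pi)
        log_p -= log_p.max(axis=1, keepdims=True)    # numerical stability
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)      # responsibilities
        # M-step: reweight components and update Bernoulli means
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = np.clip((resp.T @ X) / nk[:, None], 1e-3, 1 - 1e-3)
    return pi, mu

def sample_pseudo_exemplars(pi, mu, n_samples, seed=0):
    """Draw synthetic binary pseudo-exemplars from the fitted BMM."""
    rng = np.random.default_rng(seed)
    comps = rng.choice(len(pi), size=n_samples, p=pi)
    return (rng.random((n_samples, mu.shape[1])) < mu[comps]).astype(np.uint8)
```

In a pseudo-replay loop, one BMM per past class would be fitted on that class's binarized embeddings, and sampled pseudo-exemplars would be replayed alongside new-class data during each incremental task.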

@article{basso-bert2025_2503.10333,
  title={Generative Binary Memory: Pseudo-Replay Class-Incremental Learning on Binarized Embeddings},
  author={Yanis Basso-Bert and Anca Molnos and Romain Lemaire and William Guicquero and Antoine Dupret},
  journal={arXiv preprint arXiv:2503.10333},
  year={2025}
}