
CD^2: Constrained Dataset Distillation for Few-Shot Class-Incremental Learning

International Joint Conference on Artificial Intelligence (IJCAI), 2025
Kexin Bao
Daichi Zhang
Hansong Zhang
Yong Li
Yutao Yue
Shiming Ge
Main: 7 Pages
5 Figures
Bibliography: 2 Pages
3 Tables
Abstract

Few-shot class-incremental learning (FSCIL) has attracted significant attention for its ability to continually perform classification with only a few training samples per new class, yet it suffers from the key problem of catastrophic forgetting. Existing methods usually employ an external memory to store previous knowledge and treat it equally with the incremental classes, which cannot properly preserve essential previous knowledge. To solve this problem, and inspired by recent distillation works on knowledge transfer, we propose a framework termed Constrained Dataset Distillation (CD^2) to facilitate FSCIL, which consists of a dataset distillation module (DDM) and a distillation constraint module (DCM). Specifically, the DDM synthesizes highly condensed samples guided by the classifier, forcing the model to learn compact, essential class-related clues from a few incremental samples. The DCM introduces a dedicated loss that constrains the previously learned class distribution, preserving the distilled knowledge more sufficiently. Extensive experiments on three public datasets show the superiority of our method against other state-of-the-art competitors.
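The abstract only sketches the two modules at a high level. The snippet below is a minimal, hypothetical illustration (not the authors' implementation) of the two underlying ideas, assuming a standard PyTorch classifier: (1) synthesizing a few condensed samples by optimizing them against the current classifier, in the spirit of the DDM, and (2) adapting to new classes while penalizing drift of the prediction distribution on previously distilled exemplars, in the spirit of the DCM. All function names, hyperparameters, and the choice of a KL penalty are assumptions for illustration only.

```python
# Hypothetical sketch of classifier-guided dataset distillation plus a
# constraint on previously learned class distributions (not the paper's code).
import torch
import torch.nn.functional as F

def distill_new_class(model, few_shot_x, few_shot_y, n_syn=2, steps=200, lr=0.1):
    """Synthesize a few condensed samples for one new class (DDM-style idea)."""
    model.eval()
    # Initialize synthetic samples from real few-shot samples (common heuristic).
    syn_x = few_shot_x[:n_syn].clone().requires_grad_(True)
    syn_y = few_shot_y[:n_syn].clone()
    opt = torch.optim.SGD([syn_x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Classifier-guided objective: synthetic samples should be confidently
        # recognized as their target class by the current model.
        loss = F.cross_entropy(model(syn_x), syn_y)
        loss.backward()
        opt.step()
    return syn_x.detach(), syn_y

def constrained_update(model, new_x, new_y, old_syn_x, old_ref_probs,
                       lam=1.0, lr=1e-3, steps=50):
    """Adapt to new classes while constraining predictions on distilled
    exemplars of old classes (DCM-style idea, here a simple KL penalty)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(steps):
        opt.zero_grad()
        loss_new = F.cross_entropy(model(new_x), new_y)
        # Keep the distribution over previously learned classes close to the
        # reference probabilities stored when those classes were distilled.
        log_p_old = F.log_softmax(model(old_syn_x), dim=1)
        loss_old = F.kl_div(log_p_old, old_ref_probs, reduction="batchmean")
        (loss_new + lam * loss_old).backward()
        opt.step()
    return model
```

In this sketch, `old_ref_probs` would be the softmax outputs recorded on the distilled exemplars right after the previous session, so the penalty discourages the incremental update from distorting old-class predictions; the actual constraint loss in the paper may differ.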
