
Ferret: An Efficient Online Continual Learning Framework under Varying Memory Constraints

Abstract

In the realm of high-frequency data streams, achieving real-time learning within varying memory constraints is paramount. This paper presents Ferret, a comprehensive framework designed to enhance the online accuracy of Online Continual Learning (OCL) algorithms while dynamically adapting to varying memory budgets. Ferret employs a fine-grained pipeline parallelism strategy combined with an iterative gradient compensation algorithm, ensuring seamless handling of high-frequency data with minimal latency and effectively counteracting the challenge of stale gradients in parallel training. To adapt to varying memory budgets, its automated model partitioning and pipeline planning optimize performance regardless of memory limitations. Extensive experiments across 20 benchmarks and 5 integrated OCL algorithms demonstrate Ferret's remarkable efficiency, achieving up to 3.7× lower memory overhead to reach the same online accuracy compared to competing methods. Furthermore, Ferret consistently outperforms these methods across diverse memory budgets, underscoring its superior adaptability. These findings position Ferret as a premier framework for efficient and adaptive OCL in real-time environments.
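The staleness problem mentioned above arises because, in pipelined training, a worker's gradient is computed against weights that other updates have since overwritten. As a rough illustration of gradient compensation in general (this is a generic first-order delay-compensation heuristic in the style of DC-ASGD, not Ferret's actual algorithm, and the function name and `lam` parameter are ours):

```python
import numpy as np

def compensate_stale_gradient(grad, w_current, w_stale, lam=0.1):
    """Approximate the gradient at the current weights from a stale one.

    First-order delay-compensation heuristic (cf. DC-ASGD):
        g_comp = g + lam * g * g * (w_current - w_stale)
    Illustrative sketch only; Ferret's iterative scheme differs.
    """
    return grad + lam * grad * grad * (w_current - w_stale)

# Toy usage: a pipeline stage computed `grad` at stale weights
# while the live weights drifted to `w_current`.
w_stale = np.array([1.0, -0.5, 2.0])
grad = np.array([0.2, -0.1, 0.4])        # gradient evaluated at w_stale
w_current = np.array([0.9, -0.45, 1.8])  # weights after later updates landed

g_comp = compensate_stale_gradient(grad, w_current, w_stale)
```

The correction term scales with both the squared gradient and the weight drift, so it vanishes when the pipeline is not stale (`w_current == w_stale`).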

@article{zhou2025_2503.12053,
  title={Ferret: An Efficient Online Continual Learning Framework under Varying Memory Constraints},
  author={Yuhao Zhou and Yuxin Tian and Jindi Lv and Mingjia Shi and Yuanxi Li and Qing Ye and Shuhao Zhang and Jiancheng Lv},
  journal={arXiv preprint arXiv:2503.12053},
  year={2025}
}