Semi-parametric Memory Consolidation: Towards Brain-like Deep Continual Learning

20 April 2025
Geng Liu, Fei Zhu, Rong Feng, Zhiqiang Yi, Shiqi Wang, Gaofeng Meng, Zhaoxiang Zhang
Abstract

Humans and most animals inherently possess a distinctive capacity to continually acquire novel experiences and accumulate worldly knowledge over time. This ability, termed continual learning, is also critical for deep neural networks (DNNs) to adapt to the dynamically evolving world in open environments. However, DNNs notoriously suffer from catastrophic forgetting of previously learned knowledge when trained on sequential tasks. In this work, inspired by the interactive memory and learning systems of the human brain, we propose a novel biomimetic continual learning framework that integrates semi-parametric memory and a wake-sleep consolidation mechanism. For the first time, our method enables deep neural networks to retain high performance on novel tasks while maintaining prior knowledge in challenging real-world continual learning scenarios, e.g., class-incremental learning on ImageNet. This study demonstrates that emulating biological intelligence provides a promising path to endowing deep neural networks with continual learning capabilities.
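The abstract does not detail the architecture, but the core idea it names, pairing a parametric network with a non-parametric exemplar memory and replaying stored experiences in a sleep-like consolidation phase, can be sketched in a few lines of PyTorch. Everything below (the class name, the buffer layout, the choice to consolidate only the classifier head) is an illustrative assumption, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SemiParametricLearner(nn.Module):
    # Hypothetical sketch: the abstract does not specify the architecture,
    # so all names and design choices here are illustrative assumptions.

    def __init__(self, feature_dim=128, num_classes=100):
        super().__init__()
        # Parametric component: a small encoder standing in for a real backbone.
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256),
            nn.ReLU(),
            nn.Linear(256, feature_dim),
        )
        self.classifier = nn.Linear(feature_dim, num_classes)
        # Non-parametric component: an explicit store of past feature/label pairs.
        self.memory_feats = []
        self.memory_labels = []

    def store(self, x, y):
        # "Wake" phase: encode and record new experiences without weight updates.
        with torch.no_grad():
            self.memory_feats.append(self.encoder(x))
        self.memory_labels.append(y)

    def consolidate(self, optimizer):
        # "Sleep" phase: replay stored exemplars to consolidate knowledge.
        # Features were stored detached, so only the classifier head receives
        # gradients here; a full method would also refresh the encoder.
        feats = torch.cat(self.memory_feats)
        labels = torch.cat(self.memory_labels)
        loss = F.cross_entropy(self.classifier(feats), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()


# Usage: store a batch during "wake", then replay it during "sleep".
model = SemiParametricLearner()
opt = torch.optim.SGD(model.classifier.parameters(), lr=0.01)
x = torch.randn(16, 3, 32, 32)    # a batch of 32x32 RGB images
y = torch.randint(0, 100, (16,))  # their class labels
model.store(x, y)
print(f"replay loss: {model.consolidate(opt):.4f}")

Storing detached features rather than raw images keeps the memory cheap and is one common way to realize a semi-parametric design; whether the paper stores inputs, features, or prototypes is not stated in the abstract.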

@article{liu2025_2504.14727,
  title={Semi-parametric Memory Consolidation: Towards Brain-like Deep Continual Learning},
  author={Geng Liu and Fei Zhu and Rong Feng and Zhiqiang Yi and Shiqi Wang and Gaofeng Meng and Zhaoxiang Zhang},
  journal={arXiv preprint arXiv:2504.14727},
  year={2025}
}