Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector

8 February 2025
Qirui Wu
Shizhou Zhang
De Cheng
Yinghui Xing
Di Xu
Peng Wang
Yanning Zhang
Abstract

Catastrophic forgetting is a critical challenge for incremental object detection (IOD). Most existing methods treat the detector monolithically, relying on instance replay or knowledge distillation without analyzing component-specific forgetting. Through a dissection of Faster R-CNN, we reveal a key insight: catastrophic forgetting is predominantly localized to the RoI Head classifier, while regressors retain robustness across incremental stages. This finding challenges conventional assumptions and motivates a framework termed NSGP-RePRE. Regional Prototype Replay (RePRE) mitigates classifier forgetting via replay of two types of prototypes: coarse prototypes represent class-wise semantic centers of RoI features, while fine-grained prototypes model intra-class variations. Null Space Gradient Projection (NSGP) is further introduced to eliminate prototype-feature misalignment by updating the feature extractor in directions orthogonal to the subspace spanned by old inputs via gradient projection, aligning RePRE with incremental learning dynamics. Our simple yet effective design allows NSGP-RePRE to achieve state-of-the-art performance on the Pascal VOC and MS COCO datasets under various settings. Our work not only advances IOD methodology but also provides pivotal insights for catastrophic forgetting mitigation in IOD. Code will be available soon.
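To make the two mechanisms named in the abstract concrete, the sketch below illustrates, in PyTorch-style code, how class-wise prototype replay (RePRE) and null-space gradient projection (NSGP) could look. The paper's code has not yet been released, so all function names, shapes, and selection rules here (e.g., picking the farthest samples as fine-grained prototypes, the eigenvalue threshold eps) are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of the abstract's two mechanisms.
# Everything below is an assumption for illustration, not the released NSGP-RePRE code.
import torch
import torch.nn.functional as F


def compute_regional_prototypes(roi_feats, labels, num_fine=4):
    """Coarse prototype: class-wise mean of RoI features (semantic center).
    Fine-grained prototypes: here assumed to be the samples farthest from the
    center, standing in for intra-class variation."""
    prototypes = {}
    for c in labels.unique():
        feats_c = roi_feats[labels == c]              # (n_c, d) RoI features of class c
        coarse = feats_c.mean(dim=0)                  # class-wise semantic center
        dists = (feats_c - coarse).norm(dim=1)
        idx = dists.topk(min(num_fine, feats_c.size(0))).indices
        prototypes[int(c)] = (coarse, feats_c[idx])
    return prototypes


def prototype_replay_loss(classifier, prototypes):
    """Rehearse old classes by classifying stored prototypes instead of raw images."""
    feats, targets = [], []
    for c, (coarse, fine) in prototypes.items():
        feats.append(coarse.unsqueeze(0))
        targets.append(torch.tensor([c]))
        feats.append(fine)
        targets.append(torch.full((fine.size(0),), c))
    return F.cross_entropy(classifier(torch.cat(feats)), torch.cat(targets))


def null_space_projector(old_inputs, eps=1e-3):
    """Build a projector onto the (approximate) null space of old layer inputs,
    so projected updates leave responses to old inputs nearly unchanged."""
    cov = old_inputs.t() @ old_inputs / old_inputs.size(0)   # (d, d) uncentered covariance
    eigvals, eigvecs = torch.linalg.eigh(cov)                # ascending eigenvalues
    null_basis = eigvecs[:, eigvals < eps]                   # directions old inputs barely span
    return null_basis @ null_basis.t()                       # P = U0 U0^T


def project_gradient(weight_grad, P):
    """NSGP step: project a linear layer's weight gradient into the old-input
    null space, keeping the feature extractor aligned with stored prototypes."""
    return weight_grad @ P                                    # (out, d) @ (d, d)
```

The key property of such a projector P is that any weight update of the form G @ P changes the layer's response to old inputs only negligibly, which is what keeps the stored prototypes aligned with the evolving feature extractor during incremental training.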

View on arXiv
@article{wu2025_2502.05540,
  title={Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector},
  author={Qirui Wu and Shizhou Zhang and De Cheng and Yinghui Xing and Di Xu and Peng Wang and Yanning Zhang},
  journal={arXiv preprint arXiv:2502.05540},
  year={2025}
}