ResearchTrend.AI
Rethinking the Mean Teacher Strategy from the Perspective of Self-paced Learning

16 May 2025
Pengchen Zhang
Alan J.X. Guo
Sipin Luo
Zhe Han
Lin Guo
Abstract

Semi-supervised medical image segmentation has attracted significant attention due to its potential to reduce manual annotation costs. The mean teacher (MT) strategy, commonly understood as introducing smoothed, temporally lagged consistency regularization, has demonstrated strong performance across various tasks in this field. In this work, we reinterpret the MT strategy on supervised data as a form of self-paced learning, regulated by the output agreement between the temporally lagged teacher model and the ground truth labels. This idea is further extended to incorporate agreement between a temporally lagged model and a cross-architectural model, which offers greater flexibility in regulating the learning pace and enables application to unlabeled data. Specifically, we propose dual teacher-student learning (DTSL), a framework that introduces two groups of teacher-student models with different architectures. The output agreement between the cross-group teacher and student models is used as pseudo-labels, generated via a Jensen-Shannon divergence-based consensus label generator (CLG). Extensive experiments on popular datasets demonstrate that the proposed method consistently outperforms existing state-of-the-art approaches. Ablation studies further validate the effectiveness of the proposed modules.
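The consensus idea in the abstract can be illustrated with a minimal sketch: a teacher kept as an exponential moving average of the student (the "temporally lagged" model in mean teacher training), and a Jensen-Shannon divergence test that keeps pseudo-labels only where two models' predictive distributions agree. The function names, the agreement threshold, and the averaging rule below are illustrative assumptions, not the paper's actual CLG implementation.

```python
import numpy as np

def ema_update(teacher_w, student_w, alpha=0.99):
    # Mean teacher: teacher weights are an exponential moving average
    # of the student weights, i.e. a temporally lagged (smoothed) model.
    return {k: alpha * teacher_w[k] + (1 - alpha) * student_w[k]
            for k in teacher_w}

def js_divergence(p, q, eps=1e-8):
    # Jensen-Shannon divergence between two per-pixel class
    # distributions, computed along the last (class) axis.
    m = 0.5 * (p + q)
    kl_pm = np.sum(p * np.log((p + eps) / (m + eps)), axis=-1)
    kl_qm = np.sum(q * np.log((q + eps) / (m + eps)), axis=-1)
    return 0.5 * (kl_pm + kl_qm)

def consensus_pseudo_labels(probs_a, probs_b, threshold=0.1):
    # Sketch of a JS-divergence-based consensus label generator:
    # keep a pseudo-label only where the two models' distributions
    # agree (low JS divergence); mask out disagreeing pixels.
    # `threshold` is a hypothetical hyperparameter.
    jsd = js_divergence(probs_a, probs_b)
    labels = np.argmax(0.5 * (probs_a + probs_b), axis=-1)
    mask = jsd < threshold
    return labels, mask
```

In this sketch, masked-out pixels would simply be excluded from the unsupervised loss, so the agreement level between the two groups of models regulates how much unlabeled data contributes at each step — the self-paced-learning reading described above.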

@article{zhang2025_2505.11018,
  title={Rethinking the Mean Teacher Strategy from the Perspective of Self-paced Learning},
  author={Pengchen Zhang and Alan J.X. Guo and Sipin Luo and Zhe Han and Lin Guo},
  journal={arXiv preprint arXiv:2505.11018},
  year={2025}
}