Certainty Driven Consistency Loss on Multi-Teacher Networks for Semi-Supervised Learning

17 January 2019
Lu Liu
R. Tan
Abstract

One of the successful approaches in semi-supervised learning is based on consistency regularization. Typically, a student model is trained to be consistent with the teacher's predictions for inputs under different perturbations. To be successful, the prediction targets given by the teacher should be of good quality; otherwise, the student can be misled by the teacher. Unfortunately, existing methods do not assess the quality of the teacher targets. In this paper, we propose a novel Certainty-driven Consistency Loss (CCL) that exploits predictive uncertainty in the consistency loss to let the student dynamically learn from reliable targets. Specifically, we propose two approaches, i.e., Filtering CCL and Temperature CCL, to either filter out uncertain predictions or pay less attention to them in the consistency regularization. We further introduce a novel decoupled framework to encourage model difference. Experimental results on SVHN, CIFAR-10, and CIFAR-100 demonstrate the advantages of our method over several existing methods.
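Below is a minimal sketch of the Filtering-CCL idea described above: a consistency loss that drops targets the teacher is uncertain about. The uncertainty measure (softmax entropy), the threshold, the MSE loss form, and the function name filtering_consistency_loss are illustrative assumptions for this sketch, not the paper's exact formulation (which also includes a Temperature variant and a decoupled multi-teacher setup).

```python
# Illustrative sketch only: filtering-style consistency loss where teacher
# uncertainty is approximated by the entropy of its softmax output, and
# predictions with entropy above `threshold` are masked out of the loss.
import torch
import torch.nn.functional as F


def filtering_consistency_loss(student_logits: torch.Tensor,
                               teacher_logits: torch.Tensor,
                               threshold: float = 1.0) -> torch.Tensor:
    """MSE consistency between student and teacher probabilities,
    restricted to samples the teacher is certain about."""
    teacher_probs = F.softmax(teacher_logits.detach(), dim=1)
    student_probs = F.softmax(student_logits, dim=1)

    # Predictive entropy as an uncertainty proxy (higher = less certain).
    entropy = -(teacher_probs * torch.log(teacher_probs + 1e-8)).sum(dim=1)
    mask = (entropy < threshold).float()  # keep only confident targets

    per_sample = ((student_probs - teacher_probs) ** 2).mean(dim=1)
    return (mask * per_sample).sum() / mask.sum().clamp(min=1.0)


# Example usage with random logits standing in for model outputs.
if __name__ == "__main__":
    s = torch.randn(8, 10, requires_grad=True)
    t = torch.randn(8, 10)
    loss = filtering_consistency_loss(s, t, threshold=1.5)
    loss.backward()
    print(loss.item())
```

The Temperature variant would instead reweight each target continuously (e.g., by a certainty-dependent factor) rather than applying a hard mask, so no unlabeled sample is discarded outright.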
