ResearchTrend.AI

Learning from Noisy Labels with Contrastive Co-Transformer

4 March 2025
Yan Han
Soumava Kumar Roy
Mehrtash Harandi
Lars Petersson
Abstract

Deep learning with noisy labels is a challenging problem in weakly supervised learning. Despite their significant learning capacity, CNNs tend to overfit samples with noisy labels. To alleviate this issue, we build on the well-known Co-Training framework. In this paper, we introduce Contrastive Co-Transformer, a framework that is simple and fast, yet improves performance by a large margin over state-of-the-art approaches. We argue for the robustness of transformers when dealing with label noise. Our Contrastive Co-Transformer is able to utilize all samples in the dataset, whether clean or noisy. The transformers are trained with a combination of a contrastive loss and a classification loss. Extensive experiments on corrupted data from six standard benchmark datasets, including Clothing1M, demonstrate that Contrastive Co-Transformer is superior to existing state-of-the-art methods.
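The abstract states that two co-trained transformers are optimized with a combination of a contrastive loss and a classification loss, but gives no implementation details. The sketch below is only an illustration of what such a combined objective might look like: the NT-Xent-style contrastive term, the function names, and the weighting parameter `lam` are all assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def classification_loss(logits, labels):
    # standard cross-entropy over the (possibly noisy) labels
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

def contrastive_loss(z1, z2, temperature=0.5):
    # simplified NT-Xent-style term (an assumption, not the paper's exact loss):
    # pull together the two networks' embeddings of the same sample,
    # push apart embeddings of different samples
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # (N, N) cosine-similarity matrix
    # positives sit on the diagonal, so the target "class" of row i is i
    return classification_loss(sim, np.arange(len(z1)))

def co_transformer_loss(logits1, logits2, z1, z2, labels, lam=1.0):
    # combined objective: each network's classification loss plus a
    # contrastive term tying the two networks' representations together
    ce = classification_loss(logits1, labels) + classification_loss(logits2, labels)
    return ce + lam * contrastive_loss(z1, z2)
```

Because every sample contributes to the contrastive term regardless of its label, an objective of this shape can use the whole dataset, clean and noisy samples alike, which matches the behavior the abstract describes.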

@article{han2025_2503.03042,
  title={Learning from Noisy Labels with Contrastive Co-Transformer},
  author={Yan Han and Soumava Kumar Roy and Mehrtash Harandi and Lars Petersson},
  journal={arXiv preprint arXiv:2503.03042},
  year={2025}
}