
Understanding Self-Distillation in the Presence of Label Noise
arXiv:2301.13304 · 30 January 2023
Rudrajit Das, Sujay Sanghavi

Papers citing "Understanding Self-Distillation in the Presence of Label Noise"

6 papers shown.

The Effect of Optimal Self-Distillation in Noisy Gaussian Mixture Model
Kaito Takanami, Takashi Takahashi, Ayaka Sakata
27 Jan 2025

Provable Weak-to-Strong Generalization via Benign Overfitting
David X. Wu, A. Sahai
06 Oct 2024

Retraining with Predicted Hard Labels Provably Increases Model Accuracy
Rudrajit Das, Inderjit S Dhillon, Alessandro Epasto, Adel Javanmard, Jieming Mao, Vahab Mirrokni, Sujay Sanghavi, Peilin Zhong
17 Jun 2024

Toward Understanding Privileged Features Distillation in Learning-to-Rank
Shuo Yang, Sujay Sanghavi, Holakou Rahmanian, J. Bakus, S.V.N. Vishwanathan
19 Sep 2022

Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher
Guangda Ji, Zhanxing Zhu
20 Oct 2020

Meta Pseudo Labels
Hieu H. Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
23 Mar 2020