When Optimizing $f$-divergence is Robust with Label Noise (arXiv:2011.03687)
Jiaheng Wei, Yang Liu
7 November 2020

Papers citing "When Optimizing $f$-divergence is Robust with Label Noise" (11 papers shown)
1. Enhanced Sample Selection with Confidence Tracking: Identifying Correctly Labeled yet Hard-to-Learn Samples in Noisy Data
   Weiran Pan, Wei Wei, Feida Zhu, Yong Deng (24 Apr 2025)
2. Fairness Improves Learning from Noisily Labeled Long-Tailed Data
   Jiaheng Wei, Zhaowei Zhu, Gang Niu, Tongliang Liu, Sijia Liu, Masashi Sugiyama, Yang Liu (22 Mar 2023)
3. Smooth and Stepwise Self-Distillation for Object Detection
   Jieren Deng, Xiaoxia Zhou, Hao Tian, Zhihong Pan, Derek Aguiar (09 Mar 2023)
4. When Noisy Labels Meet Long Tail Dilemmas: A Representation Calibration Method
   Manyi Zhang, Xuyang Zhao, Jun Yao, Chun Yuan, Weiran Huang (20 Nov 2022)
5. SplitNet: Learnable Clean-Noisy Label Splitting for Learning with Noisy Labels
   Daehwan Kim, Kwang-seok Ryoo, Hansang Cho, Seung Wook Kim (20 Nov 2022)
6. Tackling Instance-Dependent Label Noise with Dynamic Distribution Calibration
   Manyi Zhang, Yuxin Ren, Zihao W. Wang, C. Yuan (11 Oct 2022)
7. Are All Losses Created Equal: A Neural Collapse Perspective
   Jinxin Zhou, Chong You, Xiao Li, Kangning Liu, Sheng Liu, Qing Qu, Zhihui Zhu (04 Oct 2022)
8. Beyond Images: Label Noise Transition Matrix Estimation for Tasks with Lower-Quality Features
   Zhaowei Zhu, Jialu Wang, Yang Liu (02 Feb 2022)
9. Mitigating Memorization of Noisy Labels via Regularization between Representations
   Hao Cheng, Zhaowei Zhu, Xing Sun, Yang Liu (18 Oct 2021)
10. To Smooth or Not? When Label Smoothing Meets Noisy Labels
    Jiaheng Wei, Hangyu Liu, Tongliang Liu, Gang Niu, Masashi Sugiyama, Yang Liu (08 Jun 2021)
11. Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels
    Erik Englesson, Hossein Azizpour (10 May 2021)