Learning with Noisy Labels: the Exploration of Error Bounds in Classification

28 January 2025 · arXiv:2501.15163
Haixia Liu, Boxiao Li, Can Yang, Yang Wang
Abstract

Numerous studies have shown that label noise can lead to poor generalization, degrading classification accuracy. Understanding how classifiers trained with deep neural networks behave in the presence of noisy labels is therefore of considerable practical significance. In this paper, we focus on error bounds for the excess risk of classification problems with noisy labels within deep learning frameworks. We begin by examining loss functions with noise-tolerant properties, which ensure that the empirical minimizer on the noisy data coincides with the one on the clean data. Next, we estimate error bounds for the excess risk, expressed as a sum of statistical error and approximation error. We bound the statistical error for a dependent (mixing) sequence with the help of the associated independent-block sequence. For the approximation error, we first express the classifiers as the composition of the softmax function with a continuous function from $[0,1]^d$ to $\mathbb{R}^K$; the main task is then to estimate the approximation error of that continuous function. Finally, we address the curse of dimensionality under a low-dimensional manifold assumption.
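As a rough sketch of the two ingredients the abstract names (the notation below is illustrative, not the paper's exact statement): write $\ell$ for the loss, $R_\ell$ and $R_\ell^\eta$ for the risks under the clean and noisy label distributions, $\mathcal{F}$ for the network class, $R_n$ for the empirical risk on $n$ samples, and $\hat f_n$ for its minimizer over $\mathcal{F}$. Noise tolerance asks that corrupting the labels not move the risk minimizer, and the excess risk then splits in the standard way:

\[
\arg\min_{f} R_\ell^{\eta}(f) \;=\; \arg\min_{f} R_\ell(f)
\quad \text{(noise tolerance)},
\]
\[
R(\hat f_n) - \inf_{f} R(f)
\;\le\;
\underbrace{2 \sup_{f \in \mathcal{F}} \bigl| R_n(f) - R(f) \bigr|}_{\text{statistical error}}
\;+\;
\underbrace{\inf_{f \in \mathcal{F}} R(f) - \inf_{f} R(f)}_{\text{approximation error}}.
\]

For i.i.d. data the supremum term is handled by standard concentration arguments; the abstract's point is that for a dependent (mixing) sequence it is instead controlled through the associated independent-block construction.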
