
Super-fast rates of convergence for Neural Networks Classifiers under the Hard Margin Condition

Abstract

We study the classical binary classification problem for hypothesis spaces of Deep Neural Networks (DNNs) with ReLU activation under Tsybakov's low-noise condition with exponent $q>0$, and its limit case $q\to\infty$, which we refer to as the "hard-margin condition". We show that DNNs which minimize the empirical risk with the square loss surrogate and an $\ell_p$ penalty can achieve finite-sample excess risk bounds of order $\mathcal{O}\left(n^{-\alpha}\right)$ for arbitrarily large $\alpha>0$ under the hard-margin condition, provided that the regression function $\eta$ is sufficiently smooth. The proof relies on a novel decomposition of the excess risk which might be of independent interest.
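For context, a minimal sketch of the noise conditions in their standard formulation (the paper's exact constants and normalization may differ): Tsybakov's low-noise condition with exponent $q>0$ requires a constant $C>0$ such that

$$\mathbb{P}\bigl(|2\eta(X)-1| \le t\bigr) \le C\, t^{q} \quad \text{for all } t \in (0,1],$$

where $\eta(x)=\mathbb{P}(Y=1 \mid X=x)$ is the regression function. Taking $q\to\infty$ forces the probability mass near the decision boundary $\{\eta=1/2\}$ to vanish, which yields the hard-margin condition: there exists $h>0$ with $|2\eta(X)-1| \ge h$ almost surely.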

@article{tepakbong2025_2505.08262,
  title={Super-fast rates of convergence for Neural Networks Classifiers under the Hard Margin Condition},
  author={Nathanael Tepakbong and Ding-Xuan Zhou and Xiang Zhou},
  journal={arXiv preprint arXiv:2505.08262},
  year={2025}
}