Minimax learning rates for estimating binary classifiers under margin conditions

We study classification problems using binary estimators where the decision boundary is described by horizon functions and where the data distribution satisfies a geometric margin condition. We establish upper and lower bounds for the minimax learning rate over broad function classes with bounded Kolmogorov entropy in Lebesgue norms. A key novelty of our work is the derivation of lower bounds on the worst-case learning rates under a geometric margin condition, a setting that is almost universally satisfied in practice but remains theoretically challenging. Moreover, our results deal with the noiseless setting, where lower bounds are particularly hard to establish. We apply our general results to classification problems with decision boundaries belonging to several function classes: for Barron-regular functions, and for Hölder-continuous functions with strong margins, we identify optimal rates close to the fast learning rate of $n^{-1}$ for $n$ samples. Also for merely convex decision boundaries, optimal rates can be achieved in the strong margin case.
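For orientation, the following is a minimal sketch of the kind of setup the abstract describes; the precise definitions, exponents, and norms below are illustrative assumptions, not taken verbatim from the paper. A horizon function $f$ on the unit cube induces the binary classifier

\[
  \chi_f(x) \;=\; \mathbf{1}\bigl\{ x_d \ge f(x_1, \dots, x_{d-1}) \bigr\},
  \qquad x = (x_1, \dots, x_d) \in [0,1]^d,
\]

a geometric margin condition with exponent $\alpha > 0$ bounds the mass that the marginal $P_X$ places near the decision boundary $\partial_f$ (the graph of $f$),

\[
  P_X\bigl( \{ x : \operatorname{dist}(x, \partial_f) \le t \} \bigr) \;\le\; C\, t^{\alpha}
  \qquad \text{for all } t > 0,
\]

and the minimax learning rate refers to the decay in $n$ of the worst-case risk over a class $\mathcal{F}$ of horizon functions, with the infimum running over all estimators $\hat{h}$ built from $n$ i.i.d. labeled samples:

\[
  \inf_{\hat{h}} \; \sup_{f \in \mathcal{F}} \; \mathbb{E}\, \bigl\| \hat{h} - \chi_f \bigr\|_{L^1(P_X)}.
\]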
@article{garcía2025_2505.10628,
  title   = {Minimax learning rates for estimating binary classifiers under margin conditions},
  author  = {Jonathan García and Philipp Petersen},
  journal = {arXiv preprint arXiv:2505.10628},
  year    = {2025}
}