
Fast learning rates for plug-in classifiers

Abstract

It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
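For context, here is a minimal sketch of the standard definitions the abstract relies on (the notation is ours, not quoted from the paper; the paper's precise conditions may differ). Given an estimator $\hat\eta_n$ of the regression function $\eta(x) = \mathbb{P}(Y = 1 \mid X = x)$, the plug-in classifier thresholds it at $1/2$, and its quality is measured by the excess Bayes risk:

\[
\hat f_n(x) = \mathbb{1}\{\hat\eta_n(x) \ge 1/2\},
\qquad
\mathcal{E}(\hat f_n) = \mathbb{P}\bigl(Y \ne \hat f_n(X)\bigr) - \mathbb{P}\bigl(Y \ne f^*(X)\bigr),
\]
where $f^*(x) = \mathbb{1}\{\eta(x) \ge 1/2\}$ is the Bayes classifier. The margin (low noise) assumption with parameter $\alpha \ge 0$ bounds the mass of $X$ near the decision boundary:
\[
\mathbb{P}\bigl(0 < |\eta(X) - 1/2| \le t\bigr) \le C\, t^{\alpha}
\quad \text{for all } t > 0.
\]

In this vocabulary, a "fast rate" means $\mathbb{E}\,\mathcal{E}(\hat f_n)$ decays faster than $n^{-1/2}$, and a "super-fast rate" means it decays faster than $n^{-1}$.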
