
Optimal rates for F-score binary classification

Abstract

We study the minimax setting of binary classification with the F-score under a $\beta$-smoothness assumption on the regression function $\eta(x) = \mathbb{P}(Y = 1 \mid X = x)$ for $x \in \mathbb{R}^d$. We propose a classification procedure which, under an $\alpha$-margin assumption, achieves the rate $O(n^{-(1+\alpha)\beta/(2\beta+d)})$ for the excess F-score. In this setting, the Bayes-optimal classifier for the F-score is obtained by thresholding the regression function $\eta$ at some level $\theta^*$, which must be estimated. The proposed procedure operates in a semi-supervised manner: the regression function is estimated from a labeled dataset of size $n \in \mathbb{N}$, while the optimal threshold $\theta^*$ is estimated from an unlabeled dataset of size $N \in \mathbb{N}$. Interestingly, the value of $N \in \mathbb{N}$ does not affect the rate of convergence, which indicates that it is "harder" to estimate the regression function $\eta$ than the optimal threshold $\theta^*$. This further implies that binary classification with the F-score behaves similarly to the standard setting of binary classification. Finally, we show that the rates achieved by the proposed procedure are minimax optimal up to a constant factor.
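The semi-supervised plug-in procedure described in the abstract can be illustrated with a minimal sketch, which is not the authors' exact estimator. It assumes a k-nearest-neighbor regression estimate of $\eta$ on the labeled data and uses the standard fixed-point characterization of the $F_1$-optimal threshold, $\mathbb{E}[(\eta(X) - \theta^*)_+] = \theta^* \,\mathbb{E}[\eta(X)]$, which involves only the marginal distribution of $X$ and can therefore be evaluated on unlabeled data. The function name `fit_plugin_fscore_classifier` and the parameter `k` are hypothetical.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def fit_plugin_fscore_classifier(X_lab, y_lab, X_unlab, k=10):
    """Illustrative plug-in procedure (sketch, not the paper's estimator):
    1. estimate eta(x) = P(Y=1 | X=x) on the labeled sample of size n,
    2. estimate the optimal threshold theta* on the unlabeled sample of size N,
    3. classify by thresholding the estimated regression function.
    """
    # Step 1: nonparametric regression estimate of eta from labeled data.
    eta_hat = KNeighborsRegressor(n_neighbors=k).fit(X_lab, y_lab)

    # Step 2: plug-in estimate of theta*. For the F1-score the optimal
    # threshold solves E[(eta(X) - theta)_+] = theta * E[eta(X)], which
    # depends only on the marginal of X, so unlabeled data suffices.
    eta_u = np.clip(eta_hat.predict(X_unlab), 0.0, 1.0)
    mean_eta = eta_u.mean()
    grid = np.linspace(0.0, 0.5, 1001)  # theta* = F*/2 lies in [0, 1/2]
    gap = np.array([np.mean(np.maximum(eta_u - t, 0.0)) - t * mean_eta
                    for t in grid])
    theta_hat = grid[np.argmin(np.abs(gap))]

    # Step 3: the resulting plug-in classifier 1{eta_hat(x) >= theta_hat}.
    def classify(X_new):
        return (np.clip(eta_hat.predict(X_new), 0.0, 1.0) >= theta_hat).astype(int)

    return classify, theta_hat
```

In this sketch the unlabeled sample only enters through the threshold estimate, consistent with the abstract's observation that $N$ does not affect the rate of convergence.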
