Constrained Convex Neyman-Pearson Classification Using an Outer Approximation Splitting Method

IEEE Transactions on Signal Processing (IEEE TSP), 2015
Abstract

We propose an efficient splitting algorithm for solving Neyman-Pearson classification problems, which consist in minimizing the type II risk subject to an upper bound constraint on the type I risk. Since the 1/0 loss function is not convex, it is customary to replace it by convex surrogates that lead to manageable optimization problems. While statistical bounds have been derived to quantify the cost of using such surrogates, no specific algorithm has yet been proposed to solve exactly the resulting constrained minimization problem, and existing work has addressed only Lagrangian approximations. The contribution of this paper is to propose an efficient splitting algorithm to address this issue. Our method alternates a gradient step on the objective and a projection step onto the lower level set modeling the constraint. The projection step is implemented via an outer approximation scheme in which the constraint set is approximated by a sequence of simple convex sets consisting of the intersection of two half-spaces. Convergence of the iterates generated by the algorithm is established. Experiments on both synthetic and biological data show that our algorithm outperforms state-of-the-art Lagrangian methods such as ν-SVM.
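To illustrate the general idea of alternating a gradient step on the objective with an approximate projection onto a constraint level set, here is a minimal sketch on a toy convex problem. This is not the paper's algorithm: the projection below uses a single subgradient half-space per inner step rather than the intersection of two half-spaces used in the paper's outer approximation scheme, and all function names, step sizes, and the toy problem itself are illustrative assumptions.

```python
import numpy as np

def project_level_set(x, g, grad_g, alpha, iters=50):
    """Approximate projection of x onto {y : g(y) <= alpha}.

    Each inner step replaces the level set by the half-space obtained
    from the linearization of g at the current point (a simplified,
    one-half-space outer approximation).
    """
    y = x.copy()
    for _ in range(iters):
        val = g(y)
        if val <= alpha:
            break
        s = grad_g(y)
        # Project onto the half-space {z : val + <s, z - y> <= alpha}.
        y = y - (val - alpha) / (s @ s) * s
    return y

def constrained_gradient_splitting(f_grad, g, grad_g, alpha, x0,
                                   step=0.1, iters=200):
    """Alternate a gradient step on the objective with a projection
    step onto the lower level set modeling the constraint."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * f_grad(x)          # gradient step on objective
        x = project_level_set(x, g, grad_g, alpha)  # constraint step
    return x

# Toy instance (not from the paper): minimize ||x - c||^2 s.t. ||x||^2 <= 1.
# The solution is the projection of c onto the unit ball, i.e. (1, 0).
c = np.array([2.0, 0.0])
x_star = constrained_gradient_splitting(
    f_grad=lambda x: 2 * (x - c),
    g=lambda x: x @ x,
    grad_g=lambda x: 2 * x,
    alpha=1.0,
    x0=np.zeros(2),
)
```

In the paper's setting, the objective would be a convex surrogate of the type II risk and the level-set function a surrogate of the type I risk bounded by the prescribed level.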