A Near-optimal Algorithm for Learning Margin Halfspaces with Massart Noise

Abstract
We study the problem of PAC learning $\gamma$-margin halfspaces in the presence of Massart noise. Without computational considerations, the sample complexity of this learning problem is known to be $\widetilde{\Theta}(1/(\gamma^2 \epsilon))$. Prior computationally efficient algorithms for the problem incur sample complexity $\widetilde{O}(1/(\gamma^4 \epsilon^3))$ and achieve 0-1 error of $\eta + \epsilon$, where $\eta < 1/2$ is the upper bound on the noise rate. Recent work gave evidence of an information-computation tradeoff, suggesting that a quadratic dependence on $1/\epsilon$ is required for computationally efficient algorithms. Our main result is a computationally efficient learner with sample complexity $\widetilde{\Theta}(1/(\gamma^2 \epsilon^2))$, nearly matching this lower bound. In addition, our algorithm is simple and practical, relying on online SGD on a carefully selected sequence of convex losses.
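The abstract does not specify the paper's loss sequence, so the following is only a generic sketch of the algorithmic template it names: projected online SGD on a convex surrogate loss (here a single fixed hinge-type loss), with an averaged iterate. All names and parameters below are illustrative assumptions, not the paper's actual algorithm.

```python
import math
import random

def online_sgd_halfspace(samples, dim, margin, steps=5000, seed=0):
    """Projected online SGD on hinge-type losses over a stream of examples.

    samples: list of (x, y) with x a unit-norm list of floats, y in {-1, +1}.
    margin:  the target margin gamma used inside the surrogate loss.
    Returns the averaged iterate (standard online-to-batch conversion).
    Illustrative sketch only; the paper uses a carefully chosen sequence
    of convex losses, which the abstract does not describe.
    """
    rng = random.Random(seed)
    w = [0.0] * dim
    avg = [0.0] * dim
    for t in range(1, steps + 1):
        x, y = samples[rng.randrange(len(samples))]
        lr = 1.0 / math.sqrt(t)  # standard step size for convex online SGD
        # Subgradient step on the hinge loss max(0, margin - y * <w, x>)
        if y * sum(wi * xi for wi, xi in zip(w, x)) < margin:
            w = [wi + lr * y * xi for wi, xi in zip(w, x)]
        # Project back onto the unit ball
        norm = math.sqrt(sum(wi * wi for wi in w))
        if norm > 1.0:
            w = [wi / norm for wi in w]
        avg = [ai + wi for ai, wi in zip(avg, w)]
    return [ai / steps for ai in avg]
```

On synthetic $\gamma$-margin data with random label flips (a Massart-style corruption), the averaged iterate recovers a direction well correlated with the true halfspace.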