
Non-Convex SGD Learns Halfspaces with Adversarial Label Noise

Abstract

We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model. For a broad family of structured distributions, including log-concave distributions, we show that non-convex SGD efficiently converges to a solution with misclassification error O(opt) + ε, where opt is the misclassification error of the best-fitting halfspace. In sharp contrast, we show that optimizing any convex surrogate inherently leads to misclassification error of ω(opt), even under Gaussian marginals.
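To illustrate the kind of procedure the abstract describes, here is a minimal sketch of SGD on a non-convex sigmoidal surrogate for learning a homogeneous halfspace under Gaussian marginals. The surrogate choice, step-size schedule, and noise model (random label flips rather than truly adversarial corruption) are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: Gaussian marginals, labels from a ground-truth homogeneous
# halfspace w_star, with a fraction of labels flipped (a stand-in for the
# noise level "opt"; the paper allows adversarial label noise).
d, n = 5, 20000
w_star = np.zeros(d)
w_star[0] = 1.0
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)
noise_rate = 0.05
flip = rng.random(n) < noise_rate
y[flip] = -y[flip]

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Non-convex surrogate loss on the margin: sigmoid(-y <w, x>).
# Under structured marginals its minimizers align with the target halfspace
# (hypothetical surrogate; the paper's exact loss may differ).
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr = 0.5
for t in range(n):
    x_t, y_t = X[t], y[t]
    margin = y_t * (w @ x_t)
    s = sigmoid(-margin)
    grad = -s * (1.0 - s) * y_t * x_t   # gradient of sigmoid(-margin) in w
    w -= lr / np.sqrt(t + 1) * grad
    w /= np.linalg.norm(w)              # project back to the unit sphere

err = np.mean(np.sign(X @ w) != y)
```

With random flips at rate 0.05, the final misclassification error `err` lands near the noise level, consistent with an O(opt) + ε guarantee; a convex surrogate such as the hinge loss can be pulled away from w_star by the corrupted points.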
