In this work we study active learning of homogeneous $s$-sparse halfspaces in $\mathbb{R}^d$ under label noise. Even in the absence of label noise this is a challenging problem, and only recently have label complexity bounds of the form $\tilde{O}\big(s \cdot \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$ been established in \citet{zhang2018efficient} for computationally efficient algorithms under the broad class of isotropic log-concave distributions. In contrast, under high levels of label noise, the label complexity bounds achieved by computationally efficient algorithms are much worse. When the label noise satisfies the {\em Massart} condition~\citep{massart2006risk}, i.e., each label is flipped with probability at most $\eta$ for a parameter $\eta \in [0, \frac{1}{2})$, the work of \citet{awasthi2016learning} provides a computationally efficient active learning algorithm under isotropic log-concave distributions with label complexity $\tilde{O}\big(\big(\frac{s \ln d}{\epsilon}\big)^{2^{\mathrm{poly}(1/(1-2\eta))}}\big)$. Hence that algorithm is label-efficient only when the noise rate $\eta$ is a fixed constant. In this work, we substantially improve on the state of the art by designing a polynomial-time algorithm for active learning of $s$-sparse halfspaces under bounded noise and isotropic log-concave distributions, with a label complexity of $\tilde{O}\big(\frac{s}{(1-2\eta)^4} \cdot \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$. Hence, our new algorithm is label-efficient even for noise rates $\eta$ arbitrarily close to $\frac{1}{2}$. Prior to our work, such a result was not known even for the random classification noise model. Our algorithm builds upon the existing margin-based algorithmic framework; at each iteration it performs a sequence of online mirror descent updates on a carefully chosen loss sequence, using a novel gradient update rule that accounts for the bounded noise.
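For reference, the online mirror descent updates mentioned above follow the standard template shown below; the particular loss sequence, mirror map, and noise-corrected gradient rule are the paper's contribution and are not reproduced here, so the display is only the generic scheme under these placeholder symbols:
\[
w_{t+1} \;=\; \operatorname*{argmin}_{w \in \mathcal{K}} \;\; \beta_t \,\langle g_t, w \rangle + D_{\Phi}(w, w_t),
\qquad
D_{\Phi}(x, y) \;=\; \Phi(x) - \Phi(y) - \langle \nabla \Phi(y),\, x - y \rangle,
\]
where $g_t$ is the gradient fed to round $t$, $\beta_t$ is a step size, $\Phi$ is a strongly convex mirror map with Bregman divergence $D_{\Phi}$, and $\mathcal{K}$ is the constraint set (for sparse halfspaces, an $\ell_1$- or $\ell_p$-ball is a typical choice to promote sparsity).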