ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv: 1805.02350 (v2, latest)

Efficient active learning of sparse halfspaces

7 May 2018
Chicheng Zhang
arXiv (abs) | PDF | HTML
Abstract

We study the problem of efficient PAC active learning of homogeneous linear classifiers (halfspaces) in $\mathbb{R}^d$, where the goal is to learn a halfspace with low error using as few label queries as possible. Under the extra assumption that there is a $t$-sparse halfspace that performs well on the data ($t \ll d$), we would like our active learning algorithm to be *attribute efficient*, i.e. to have label requirements sublinear in $d$. In this paper, we provide a computationally efficient algorithm that achieves this goal. Under certain distributional assumptions on the data, our algorithm achieves a label complexity of $O(t \cdot \mathrm{polylog}(d, \frac{1}{\epsilon}))$. In contrast, existing algorithms in this setting are either computationally inefficient, or subject to label requirements polynomial in $d$ or $\frac{1}{\epsilon}$.
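To make the setting concrete, here is a minimal toy sketch (not the paper's actual algorithm) of attribute-efficient active learning: labels are queried only for pool points near the current decision boundary (uncertainty sampling), and a hard-thresholding step keeps the learned weight vector $t$-sparse. All names, parameters, and the perceptron-style update are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: d-dimensional data, the true halfspace is t-sparse.
d, t, n_pool = 50, 3, 2000
w_star = np.zeros(d)
w_star[:t] = 1.0
w_star /= np.linalg.norm(w_star)

X = rng.standard_normal((n_pool, d))
y = np.sign(X @ w_star)  # ground-truth labels, revealed only on query


def hard_threshold(w, t):
    """Keep the t largest-magnitude coordinates of w; zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-t:]
    out[idx] = w[idx]
    return out


w = np.zeros(d)
unqueried = np.ones(n_pool, dtype=bool)
queried = 0
for _ in range(200):
    # Uncertainty sampling: query the unlabeled point closest to the
    # current boundary (smallest |<w, x>|).
    margins = np.abs(X @ w)
    margins[~unqueried] = np.inf
    i = int(np.argmin(margins))
    unqueried[i] = False
    yi = y[i]  # the only place a label is requested
    queried += 1
    if yi * (X[i] @ w) <= 0:
        # Perceptron-style update on mistakes, re-sparsified each time.
        w = hard_threshold(w + yi * X[i], t)

err = float(np.mean(np.sign(X @ w) != y))
print(f"queried={queried}, nonzeros={np.count_nonzero(w)}, pool error={err:.3f}")
```

The point of the sketch is the query budget: only 200 of the 2000 pool labels are ever requested, and the hypothesis stays $t$-sparse throughout. The paper's algorithm achieves its $O(t \cdot \mathrm{polylog}(d, \frac{1}{\epsilon}))$ guarantee with a more careful sampling and update scheme than this heuristic.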
