
Near-Polynomially Competitive Active Logistic Regression

Abstract

We address the problem of active logistic regression in the realizable setting. It is well known that active learning can require exponentially fewer label queries than passive learning, in some cases using log(1/ε) rather than poly(1/ε) labels to achieve error ε above the optimum. We present the first algorithm that is polynomially competitive with the optimal algorithm on every input instance, up to factors polylogarithmic in the error and domain size. In particular, if any algorithm achieves label complexity polylogarithmic in ε, so does ours. Our algorithm is based on efficient sampling and can be extended to learn more general classes of functions. We further support our theoretical results with experiments demonstrating performance gains for logistic regression over existing active learning algorithms.
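For context on what "active" means here, the sketch below shows a generic pool-based active learner for logistic regression using uncertainty sampling (query the point whose predicted probability is closest to 1/2). This is a standard baseline, not the competitive algorithm proposed in the paper; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # Numerically stable logistic function.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_logistic(X, y, lr=0.5, steps=200):
    # Plain gradient descent on the logistic loss (no intercept, for brevity).
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

def active_logistic(X_pool, oracle, budget, seed=0):
    # Pool-based active learning with uncertainty sampling:
    # repeatedly label the pool point the current model is least sure about.
    rng = np.random.default_rng(seed)
    labeled = [int(i) for i in rng.choice(len(X_pool), size=2, replace=False)]
    y = {i: oracle(i) for i in labeled}
    while len(labeled) < budget:
        w = fit_logistic(X_pool[labeled], np.array([y[i] for i in labeled]))
        p = sigmoid(X_pool @ w)
        p[labeled] = 1.0  # already-labeled points look maximally certain, so they are skipped
        i = int(np.argmin(np.abs(p - 0.5)))  # most uncertain unlabeled point
        labeled.append(i)
        y[i] = oracle(i)
    return fit_logistic(X_pool[labeled], np.array([y[i] for i in labeled]))
```

In the realizable setting assumed by the paper, queries like these concentrate near the decision boundary, which is the intuition behind the exponential label savings over passive sampling.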

@article{zhou2025_2503.05981,
  title={Near-Polynomially Competitive Active Logistic Regression},
  author={Yihan Zhou and Eric Price and Trung Nguyen},
  journal={arXiv preprint arXiv:2503.05981},
  year={2025}
}