arXiv:1312.2315
Noisy Bayesian Active Learning

9 December 2013
Mohammad Naghshvar
T. Javidi
Kamalika Chaudhuri
Abstract

We consider the problem of noisy Bayesian active learning, where we are given a finite set of functions $\mathcal{H}$, a sample space $\mathcal{X}$, and a label set $\mathcal{L}$. One of the functions in $\mathcal{H}$ assigns labels to samples in $\mathcal{X}$. The goal is to identify the function that generates the labels even though the result of a label query on a sample is corrupted by independent noise. More precisely, the objective is to declare one of the functions in $\mathcal{H}$ as the true label generating function with high confidence using as few label queries as possible, by selecting the queries adaptively and in a strategic manner. Previous work in Bayesian active learning considers Generalized Binary Search, and its variants for the noisy case, and analyzes the number of queries required by these sampling strategies. In this paper, we show that these schemes are, in general, suboptimal. Instead we propose and analyze an alternative strategy for sample collection. Our sampling strategy is motivated by a connection between Bayesian active learning and active hypothesis testing, and is based on querying the label of a sample which maximizes the Extrinsic Jensen-Shannon divergence at each step. We provide upper and lower bounds on the performance of this sampling strategy, and show that these bounds are better than previous bounds.
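As a rough illustration of the query rule the abstract describes, the sketch below scores each candidate sample by the Extrinsic Jensen-Shannon (EJS) divergence of its label distribution under the current posterior, and queries the maximizer. The EJS form used here — each hypothesis's label distribution compared against the posterior-weighted mixture of the others' — follows the standard definition from the active hypothesis testing literature; all function names, the toy two-hypothesis setup, and the handling of degenerate posteriors are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def kl(p, q):
    """KL divergence D(p || q) between discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def ejs(posterior, label_probs):
    """Extrinsic Jensen-Shannon divergence for one candidate sample.

    posterior    : shape (n,), current belief over the n hypotheses
    label_probs  : shape (n, L), row i = label distribution of the
                   candidate sample if hypothesis i is true
    """
    total = 0.0
    n = len(posterior)
    for i, rho_i in enumerate(posterior):
        # Skip degenerate weights; with rho_i = 1 the extrinsic
        # mixture is undefined (illustrative convention).
        if rho_i <= 0.0 or rho_i >= 1.0:
            continue
        others = np.delete(np.arange(n), i)
        # Mixture of the other hypotheses' label distributions,
        # reweighted to sum to one.
        w = posterior[others] / (1.0 - rho_i)
        mix = np.einsum('j,jl->l', w, label_probs[others])
        total += rho_i * kl(label_probs[i], mix)
    return total

def select_query(posterior, label_probs_per_sample):
    """Pick the sample whose label query maximizes EJS divergence."""
    scores = [ejs(posterior, lp) for lp in label_probs_per_sample]
    return int(np.argmax(scores))
```

On a toy instance with two equally likely hypotheses, a sample on which the hypotheses predict opposite labels scores strictly higher than one on which they agree, so the rule queries the discriminating sample first.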
