Beyond Worst-Case Online Classification: VC-Based Regret Bounds for Relaxed Benchmarks

14 April 2025
Omar Montasser
Abhishek Shetty
Nikita Zhivotovskiy
Abstract

We revisit online binary classification by shifting the focus from competing with the best-in-class binary loss to competing against relaxed benchmarks that capture smoothed notions of optimality. Instead of measuring regret relative to the exact minimal binary error -- a standard approach that leads to worst-case bounds tied to the Littlestone dimension -- we consider comparing with predictors that are robust to small input perturbations, perform well under Gaussian smoothing, or maintain a prescribed output margin. Previous examples of such relaxations were primarily limited to the hinge loss. Our algorithms achieve regret guarantees that depend only on the VC dimension and the complexity of the instance space (e.g., metric entropy), and notably, they incur only an O(log(1/γ)) dependence on the generalized margin γ. This stands in contrast to most existing regret bounds, which typically exhibit a polynomial dependence on 1/γ. We complement this with matching lower bounds. Our analysis connects recent ideas from adversarial robustness and smoothed online learning.
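
As a concrete sketch of the margin-based relaxed benchmark described in the abstract (the notation below, including the hypothesis class F and the regret functional, is our own illustration and is not drawn verbatim from the paper):

% Hedged sketch: regret against a gamma-margin relaxed comparator.
% \mathcal{F} is a class of real-valued predictors whose sign class has
% finite VC dimension; \hat{y}_t is the learner's prediction and
% y_t \in \{-1, +1\} the true label. The comparator f is charged whenever
% it fails to achieve output margin \gamma, so the benchmark is weaker
% (easier to compete with) than the exact binary-loss minimizer.
\[
  \mathrm{Reg}_T(\gamma) \;=\; \sum_{t=1}^{T} \mathbf{1}\{\hat{y}_t \neq y_t\}
  \;-\; \inf_{f \in \mathcal{F}} \sum_{t=1}^{T} \mathbf{1}\{\, y_t f(x_t) \le \gamma \,\}.
\]

Under this kind of relaxation, the O(log(1/γ)) margin dependence stated in the abstract replaces the polynomial 1/γ dependence typical of prior margin-based regret bounds.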

View on arXiv
@article{montasser2025_2504.10598,
  title={Beyond Worst-Case Online Classification: VC-Based Regret Bounds for Relaxed Benchmarks},
  author={Omar Montasser and Abhishek Shetty and Nikita Zhivotovskiy},
  journal={arXiv preprint arXiv:2504.10598},
  year={2025}
}