Online Convex Optimization with a Separation Oracle

3 October 2024
Zakaria Mhammedi
arXiv:2410.02476
Abstract

In this paper, we introduce a new projection-free algorithm for Online Convex Optimization (OCO) with a state-of-the-art regret guarantee among separation-based algorithms. Existing projection-free methods based on the classical Frank-Wolfe algorithm achieve a suboptimal regret bound of $O(T^{3/4})$, while more recent separation-based approaches guarantee a regret bound of $O(\kappa \sqrt{T})$, where $\kappa$ denotes the asphericity of the feasible set, defined as the ratio of the radii of the containing and contained balls. However, for ill-conditioned sets, $\kappa$ can be arbitrarily large, potentially leading to poor performance. Our algorithm achieves a regret bound of $\widetilde{O}(\sqrt{dT} + \kappa d)$, while requiring only $\widetilde{O}(1)$ calls to a separation oracle per round. Crucially, the main term in the bound, $\widetilde{O}(\sqrt{dT})$, is independent of $\kappa$, addressing the limitations of previous methods. Additionally, as a by-product of our analysis, we recover the $O(\kappa \sqrt{T})$ regret bound of existing OCO algorithms with a more straightforward analysis and improve the regret bound for projection-free online exp-concave optimization. Finally, for constrained stochastic convex optimization, we achieve a state-of-the-art convergence rate of $\widetilde{O}(\sigma/\sqrt{T} + \kappa d/T)$, where $\sigma$ represents the noise in the stochastic gradients, while requiring only $\widetilde{O}(1)$ calls to a separation oracle per iteration.
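As a concrete reading of the definition above: writing $r\mathbb{B} \subseteq \mathcal{K} \subseteq R\mathbb{B}$ for a feasible set $\mathcal{K}$ sandwiched between Euclidean balls of radii $r$ and $R$ (these symbols are our paraphrase of the abstract's verbal definition, not notation taken from the paper), the asphericity is the ratio $\kappa = R/r$, so a long, thin set has large $\kappa$.

The algorithm accesses $\mathcal{K}$ only through a separation oracle rather than a projection. The Python sketch below illustrates the standard oracle interface for the simple case of a Euclidean ball; the choice of set, the function name, and the return convention are illustrative assumptions for exposition, not the paper's implementation:

    import numpy as np

    def separation_oracle_ball(x, center, radius):
        # Separation oracle for K = {y : ||y - center||_2 <= radius}.
        # Returns (True, None) if x lies in K; otherwise (False, g), where
        # the unit vector g satisfies <g, x> > max_{y in K} <g, y>, so the
        # hyperplane {y : <g, y> = <g, x>} separates x from K.
        diff = x - center
        dist = np.linalg.norm(diff)
        if dist <= radius:
            return True, None
        return False, diff / dist  # outward unit normal at the boundary point nearest x

    # Hypothetical queries against the unit ball in R^3:
    center = np.zeros(3)
    print(separation_oracle_ball(np.array([0.1, 0.2, 0.0]), center, 1.0))  # (True, None)
    print(separation_oracle_ball(np.array([2.0, 0.0, 0.0]), center, 1.0))  # (False, array([1., 0., 0.]))

For general convex sets, such an oracle is typically far cheaper to evaluate than a Euclidean projection onto $\mathcal{K}$, which is what makes the $\widetilde{O}(1)$ oracle calls per round a meaningful efficiency claim.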
