Parameter-free Algorithms for the Stochastically Extended Adversarial Model

Main: 10 pages
Bibliography: 3 pages
Appendix: 15 pages
2 tables
Abstract

We develop the first parameter-free algorithms for the Stochastically Extended Adversarial (SEA) model, a framework that bridges adversarial and stochastic online convex optimization. Existing approaches for the SEA model require prior knowledge of problem-specific parameters, such as the diameter of the domain $D$ and the Lipschitz constant of the loss functions $G$, which limits their practical applicability. To address this, we design parameter-free methods by leveraging the Optimistic Online Newton Step (OONS) algorithm to eliminate the need for these parameters. We first establish a comparator-adaptive algorithm for the scenario with unknown domain diameter but known Lipschitz constant, achieving an expected regret bound of $\tilde{O}\big(\|u\|_2^2 + \|u\|_2(\sqrt{\sigma^2_{1:T}} + \sqrt{\Sigma^2_{1:T}})\big)$, where $u$ is the comparator vector and $\sigma^2_{1:T}$ and $\Sigma^2_{1:T}$ denote the cumulative stochastic variance and the cumulative adversarial variation, respectively. We then extend this to the more general setting where both $D$ and $G$ are unknown, obtaining a comparator- and Lipschitz-adaptive algorithm. Notably, its regret bound exhibits the same dependence on $\sigma^2_{1:T}$ and $\Sigma^2_{1:T}$, demonstrating the efficacy of our proposed methods even when both parameters are unknown in the SEA model.
