Step-Size Stability in Stochastic Optimization: A Theoretical Perspective

Fabian Schaipp
Robert M. Gower
Adrien Taylor
Main: 16 pages · 14 figures · 1 table · Bibliography: 4 pages · Appendix: 9 pages
Abstract

We present a theoretical analysis of stochastic optimization methods in terms of their sensitivity to the step size. We identify a key quantity that, for each method, describes how performance degrades as the step size becomes too large. For convex problems, we show that this quantity directly enters the suboptimality bound of the method. Most importantly, our analysis provides direct theoretical evidence that adaptive step-size methods, such as SPS or NGN, are more robust than SGD. This allows us to quantify the advantage of these adaptive methods beyond empirical evaluation. Finally, we show through experiments that our theoretical bound qualitatively mirrors the actual performance as a function of the step size, even for nonconvex problems.
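To illustrate the robustness phenomenon the abstract describes, here is a minimal sketch (not the paper's method or bounds) comparing plain SGD with a fixed step size against the Stochastic Polyak Step size (SPS) on a toy interpolating least-squares problem, where each per-sample optimum satisfies f_i* = 0. All names and parameters below are illustrative assumptions; the SPS rule itself is the standard one, gamma_k = min(gamma_max, (f_i(x) - f_i*) / (c * ||grad f_i(x)||^2)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: f_i(x) = 0.5 * (a_i @ x - b_i)^2.
# b = A @ x_true, so interpolation holds and each f_i* = 0.
n, d = 50, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def sgd_step(x, i, gamma):
    # Plain SGD: a fixed step size, sensitive to mis-tuning.
    g = (A[i] @ x - b[i]) * A[i]
    return x - gamma * g

def sps_step(x, i, gamma_max, c=0.5):
    # Stochastic Polyak step size:
    # gamma = min(gamma_max, (f_i(x) - f_i*) / (c * ||grad f_i(x)||^2)).
    r = A[i] @ x - b[i]
    f_i = 0.5 * r ** 2          # f_i* = 0 under interpolation
    g = r * A[i]
    g2 = g @ g
    gamma = gamma_max if g2 == 0.0 else min(gamma_max, f_i / (c * g2))
    return x - gamma * g

def run(step, x0, n_iters, **kw):
    x = x0.copy()
    for _ in range(n_iters):
        i = rng.integers(n)
        x = step(x, i, **kw)
    return 0.5 * np.mean((A @ x - b) ** 2)

x0 = np.zeros(d)
# With a deliberately large step size, SGD typically diverges on this
# problem, while SPS caps its effective step and stays stable.
loss_sgd = run(sgd_step, x0, 500, gamma=2.0)
loss_sps = run(sps_step, x0, 500, gamma_max=2.0)
print(f"SGD loss: {loss_sgd:.3e}")
print(f"SPS loss: {loss_sps:.3e}")
```

The point of the sketch is qualitative: the adaptive rule shrinks the step automatically when the fixed choice gamma_max is too aggressive, which is the kind of robustness gap the paper quantifies theoretically.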
