Gradient Methods with Online Scaling Part II. Practical Aspects

Main: 18 pages, 7 figures, 1 table
Bibliography: 3 pages
Appendix: 19 pages
Abstract

Part I of this work [Gao25] establishes online scaled gradient methods (OSGM), a framework that uses online convex optimization to adapt stepsizes in gradient methods. This paper focuses on the practical aspects of OSGM. We leverage the OSGM framework to design new adaptive first-order methods and provide insights into their empirical behavior. The resulting method, OSGM-Best, matches the performance of quasi-Newton variants while using less memory and cheaper iterations. We also extend OSGM to nonconvex optimization and outline directions that connect OSGM to existing branches of optimization theory and practice.
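To make the adaptive-stepsize loop concrete, the following is a minimal sketch, assuming a diagonal (per-coordinate) scaling updated by a plain hypergradient step. The function name, the learning rates eta and p0, and the feedback rule are illustrative assumptions; this is not the paper's OSGM-Best method, which instead runs an online convex optimization algorithm on a surrogate feedback.

import numpy as np

def online_scaled_gradient(grad, x0, T=500, eta=1e-4, p0=1e-2):
    # Per-coordinate stepsizes p, adapted online from one-step feedback.
    x = x0.copy()
    p = np.full_like(x0, p0)
    g = grad(x)
    for _ in range(T):
        x_next = x - p * g                 # scaled gradient step
        g_next = grad(x_next)
        # Hypergradient: d/dp f(x - p*g) = -g_next * g, so an online
        # gradient step on p adds eta * g_next * g; clip so the
        # scaling stays nonnegative.
        p = np.maximum(p + eta * g_next * g, 0.0)
        x, g = x_next, g_next
    return x

# Toy usage on an ill-conditioned quadratic f(x) = 0.5 x'Ax.
A = np.diag([1.0, 10.0])
x_min = online_scaled_gradient(lambda x: A @ x, np.array([1.0, 1.0]))

The sketch only illustrates the feedback loop: take a scaled step, observe how the landscape responds, and let an online learner adjust the scaling. In the OSGM framework this learner is a no-regret online convex optimization algorithm acting on a principled surrogate feedback rather than the raw hypergradient used above.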
