Stochastic Weakly Convex Optimization Beyond Lipschitz Continuity
International Conference on Machine Learning (ICML), 2024
Wenzhi Gao
Abstract
This paper considers stochastic weakly convex optimization without the standard Lipschitz continuity assumption. Based on new adaptive regularization (stepsize) strategies, we show that a wide class of stochastic algorithms, including the stochastic subgradient method, preserves the $\mathcal{O}(1/\sqrt{K})$ convergence rate with constant failure rate. Our analyses rest on rather weak assumptions: the Lipschitz parameter can be either bounded by a general growth function of $\|x\|$ or locally estimated through independent random samples.
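To illustrate the kind of scheme the abstract describes, the following is a minimal sketch of a stochastic subgradient method whose stepsize is damped by a locally estimated Lipschitz parameter. The estimator (the maximum subgradient norm over `m` fresh samples), the stepsize form, and all parameter names (`gamma0`, `m`, `K`) are illustrative assumptions for the sketch, not the exact strategy or schedule analyzed in the paper.

```python
import numpy as np

def adaptive_sgm(subgrad, x0, K, gamma0=1.0, m=8, rng=None):
    """Sketch of a stochastic subgradient method with an adaptive stepsize
    rescaled by a locally estimated Lipschitz parameter.

    subgrad(x, rng) returns one stochastic subgradient sample at x.
    The local estimator and the stepsize rule below are assumptions made
    for illustration, not the paper's exact algorithm.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(K):
        # Locally estimate the Lipschitz parameter from m independent samples.
        L_hat = max(np.linalg.norm(subgrad(x, rng)) for _ in range(m))
        # O(1/sqrt(K)) base stepsize, shrunk further where L_hat is large.
        gamma = gamma0 / (np.sqrt(K) * max(1.0, L_hat))
        x = x - gamma * subgrad(x, rng)  # stochastic subgradient step
    return x

# Example use on robust phase retrieval, f(x) = E|<a, x>^2 - b|, which is
# weakly convex but not globally Lipschitz (subgradients grow with ||x||).
def phase_retrieval_subgrad(x, rng, d=10):
    a = rng.standard_normal(d)
    b = rng.standard_normal() ** 2
    r = np.dot(a, x) ** 2 - b
    return np.sign(r) * 2.0 * np.dot(a, x) * a

x_hat = adaptive_sgm(phase_retrieval_subgrad, x0=np.ones(10), K=5000)
```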
