Randomized Coordinate Subgradient Method for Nonsmooth Optimization

Abstract

In this work, we propose the Randomized Coordinate Subgradient method (RCS) for solving nonsmooth convex and nonsmooth nonconvex (nonsmooth weakly convex) optimization problems. RCS randomly selects one block coordinate to update at each iteration, making it more practical than updating all coordinates. We consider the linearly bounded subgradients assumption for the objective function, which is more general than the traditional Lipschitz continuity assumption, to account for practical scenarios. We then conduct a thorough convergence analysis of RCS in both the convex and nonconvex cases based on this generalized Lipschitz-type assumption. Specifically, we establish the $\widetilde{\mathcal{O}}(1/\sqrt{k})$ convergence rate in expectation and the $\tilde{o}(1/\sqrt{k})$ almost sure asymptotic convergence rate in terms of the suboptimality gap when $f$ is nonsmooth convex. If $f$ further satisfies the global quadratic growth condition, the improved $\mathcal{O}(1/k)$ rate is shown in terms of the squared distance to the optimal solution set. For the case when $f$ is nonsmooth weakly convex and its subdifferential satisfies the global metric subregularity property, we derive the $\mathcal{O}(1/T^{1/4})$ iteration complexity in expectation, where $T$ is the total number of iterations. We also establish an asymptotic convergence result. To justify the global metric subregularity property utilized in the analysis, we establish this error bound condition for the concrete (real-valued) robust phase retrieval problem, which is of independent interest. We also provide a convergence lemma and a result relating the global metric subregularity properties of a weakly convex function and its Moreau envelope, both of which are of independent interest. Finally, we conduct several experiments to demonstrate the possible superiority of RCS over the subgradient method.
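For intuition, the following is a minimal sketch of the kind of randomized block coordinate subgradient iteration the abstract describes, assuming a user-supplied subgradient oracle `subgrad`, a fixed block partition, and a simple diminishing step size; the block-selection rule, step-size schedule, and constants used in the paper may differ, so this is illustrative rather than the paper's exact algorithm.

```python
import numpy as np

def rcs(subgrad, x0, blocks, num_iters=1000, step0=1.0, seed=0):
    """Sketch of a Randomized Coordinate Subgradient (RCS) loop.

    subgrad(x) -> a subgradient of the objective f at x (oracle, assumed given)
    blocks     -> list of index arrays partitioning the coordinates of x
    step0      -> base step size for a diminishing schedule (illustrative choice)
    """
    x = np.asarray(x0, dtype=float).copy()
    rng = np.random.default_rng(seed)
    for k in range(num_iters):
        i = rng.integers(len(blocks))        # pick one block uniformly at random
        g = subgrad(x)                       # full subgradient here for simplicity;
                                             # in practice only block i would be computed
        step = step0 / np.sqrt(k + 1)        # diminishing step size (illustrative)
        x[blocks[i]] -= step * g[blocks[i]]  # update only the selected block
    return x

# Usage example on a nonsmooth convex objective f(x) = ||A x - b||_1
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(200, 50)), rng.normal(size=200)
    f_subgrad = lambda x: A.T @ np.sign(A @ x - b)  # a subgradient of ||Ax - b||_1
    blocks = np.array_split(np.arange(50), 10)      # 10 coordinate blocks of size 5
    x_hat = rcs(f_subgrad, np.zeros(50), blocks, num_iters=5000)
    print(np.abs(A @ x_hat - b).sum())
```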
