
Randomized Block-Coordinate Optimistic Gradient Algorithms for Root-Finding Problems

Mathematics of Operations Research (MOR), 2023
Abstract

In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations, also known as root-finding problems. Our first algorithm is non-accelerated with constant stepsizes and achieves an $\mathcal{O}(1/k)$ best-iterate convergence rate on $\mathbb{E}[\Vert Gx^k\Vert^2]$ when the underlying operator $G$ is Lipschitz continuous and the equation $Gx = 0$ admits a weak Minty solution, where $\mathbb{E}[\cdot]$ denotes the expectation and $k$ is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. We establish both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ last-iterate convergence rates on both $\mathbb{E}[\Vert Gx^k\Vert^2]$ and $\mathbb{E}[\Vert x^{k+1} - x^k\Vert^2]$ for this algorithm under the co-coercivity of $G$. We then apply our methods to a class of finite-sum nonlinear inclusions that covers various applications in machine learning and statistical learning, especially in federated learning and network optimization. We obtain two new federated learning-type algorithms for this problem class with rigorous convergence rate guarantees.
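To make the setting concrete, below is a minimal Python sketch of a randomized block-coordinate optimistic (Popov-style) gradient loop for $Gx = 0$. This is an illustrative assumption, not the paper's exact scheme: the abstract does not specify the update rule, so the sketch uses the standard optimistic gradient step restricted to one uniformly sampled block, with a constant stepsize; the names `rand_block_og`, `G`, `eta`, and `n_blocks` are all hypothetical.

```python
import numpy as np

def rand_block_og(G, x0, eta=0.1, n_blocks=4, iters=1000, seed=0):
    """Hypothetical sketch of a randomized block-coordinate optimistic
    gradient loop for Gx = 0 (assumed scheme, not the paper's algorithm).

    G        : callable returning the operator value G(x) as an ndarray
    x0       : starting point
    eta      : constant stepsize (the paper also uses constant stepsizes)
    n_blocks : number of coordinate blocks, sampled uniformly here
    """
    rng = np.random.default_rng(seed)
    x_prev, x = x0.copy(), x0.copy()
    blocks = np.array_split(np.arange(x0.size), n_blocks)
    for _ in range(iters):
        i = rng.integers(n_blocks)          # sample one block uniformly
        b = blocks[i]
        g_cur, g_old = G(x), G(x_prev)      # operator at current/previous iterates
        x_prev = x.copy()
        # optimistic (Popov-style) update restricted to the sampled block:
        #   x_b^{k+1} = x_b^k - eta * (2*[Gx^k]_b - [Gx^{k-1}]_b)
        x[b] = x[b] - eta * (2.0 * g_cur[b] - g_old[b])
    return x

# toy usage: G is an affine monotone operator with root x* = A^{-1} b
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([1.0, 1.0])
G = lambda x: A @ x - b
x_star = rand_block_og(G, np.zeros(2), eta=0.15, n_blocks=2, iters=5000)
print(x_star, np.linalg.norm(G(x_star)))    # residual ||Gx|| should be small
```

For simplicity the sketch evaluates the full operator $G$ at each step; the point of block-coordinate methods is that a practical implementation would compute only the sampled block $[Gx]_b$, which is what makes them attractive in the federated learning setting the paper targets.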
