Randomized Block-Coordinate Optimistic Gradient Algorithms for
Root-Finding Problems
In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations, which are called root-finding problems. Our first algorithm is non-accelerated with constant stepsizes, and achieves a best-iterate convergence rate of $\mathcal{O}(1/k)$ on $\mathbb{E}[\|Gx^k\|^2]$ when the underlying operator $G$ is Lipschitz continuous and the equation admits a weak Minty solution, where $\mathbb{E}[\cdot]$ is the expectation and $k$ is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. We establish both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ last-iterate convergence rates on both $\mathbb{E}[\|Gx^k\|^2]$ and $\mathbb{E}[\|x^{k+1} - x^k\|^2]$ for this algorithm under the co-coerciveness of $G$. Then, we apply our methods to a class of finite-sum nonlinear inclusions, which covers various applications in machine learning and statistical learning, especially in federated learning and network optimization. We obtain two new federated learning-type algorithms for this problem class with rigorous convergence rate guarantees.
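To make the block-coordinate idea concrete, below is a minimal sketch of one possible randomized block-coordinate optimistic gradient loop for solving $Gx = 0$. It assumes the classical optimistic update $x^{k+1} = x^k - \eta(2Gx^k - Gx^{k-1})$ applied to one uniformly sampled coordinate block per iteration; the function names, block partition, step size, and test operator are illustrative, and the paper's exact update rule and stepsize conditions may differ.

```python
import numpy as np

def randomized_block_og(G, x0, eta=0.1, n_blocks=4, iters=1000, seed=0):
    """Sketch of a randomized block-coordinate optimistic gradient loop
    for the root-finding problem Gx = 0 (assumed update rule, not the
    paper's exact scheme). Each iteration samples one coordinate block
    uniformly and applies the optimistic step
        x[idx] <- x[idx] - eta * (2*G(x)[idx] - G(x_prev)[idx]).
    """
    rng = np.random.default_rng(seed)
    x, x_prev = x0.copy(), x0.copy()
    blocks = np.array_split(np.arange(x0.size), n_blocks)
    for _ in range(iters):
        idx = blocks[rng.integers(n_blocks)]      # sample one block uniformly
        # For clarity we evaluate the full operator; a practical
        # block-coordinate method would evaluate only the sampled block.
        g_cur, g_old = G(x)[idx], G(x_prev)[idx]
        x_prev = x.copy()
        x[idx] = x[idx] - eta * (2.0 * g_cur - g_old)  # optimistic step on block
    return x

# Toy usage: a co-coercive linear operator G(x) = 0.5 * x (hypothetical test case)
if __name__ == "__main__":
    G = lambda x: 0.5 * x
    x_final = randomized_block_og(G, np.ones(8))
    print(np.linalg.norm(G(x_final)))  # residual ||Gx^k|| should be small
```

The per-iteration cost advantage of such methods comes from updating (and, in a careful implementation, evaluating) only one block of coordinates at a time, which is what makes them attractive for the large-scale finite-sum and federated settings discussed in the abstract.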