Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

Main: 6 pages · 1 figure · 1 table · Bibliography: 3 pages
Abstract

Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms have been lacking without assumptions that bound data heterogeneity. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
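To illustrate the multiple-local-steps pattern the abstract refers to, below is a minimal sketch in Python/NumPy of a generic FedAvg-style loop: each client runs several stochastic subgradient steps on a non-smooth convex loss (a hinge loss here) before a single communication round averages the client models. This is not the FedMLS algorithm itself; the paper's projection-efficient subroutines are not described in the abstract, and all names, data, and hyperparameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: synthetic binary classification data split across clients.
n_clients, n_per_client, dim = 5, 200, 10
w_true = rng.normal(size=dim)
X = [rng.normal(size=(n_per_client, dim)) for _ in range(n_clients)]
y = [np.sign(Xi @ w_true + 0.1 * rng.normal(size=n_per_client)) for Xi in X]

def hinge_subgradient(w, Xi, yi, batch=32):
    """Stochastic subgradient of the (non-smooth) hinge loss on one client."""
    idx = rng.choice(len(yi), size=batch, replace=False)
    Xb, yb = Xi[idx], yi[idx]
    margins = yb * (Xb @ w)
    active = margins < 1.0  # a subgradient is -y*x on examples with margin < 1
    return -(yb[active, None] * Xb[active]).sum(axis=0) / batch

def fed_local_steps(rounds=50, local_steps=10, lr=0.05):
    """Generic FedAvg-style loop: K local subgradient steps, then averaging."""
    w = np.zeros(dim)
    for _ in range(rounds):
        updates = []
        for Xi, yi in zip(X, y):            # each client starts from the server model
            w_local = w.copy()
            for _ in range(local_steps):    # multiple local steps before communicating
                w_local -= lr * hinge_subgradient(w_local, Xi, yi)
            updates.append(w_local)
        w = np.mean(updates, axis=0)        # one communication round: average models
    return w

w_out = fed_local_steps()
```

In this sketch, each of the `rounds` communication rounds costs `n_clients * local_steps` subgradient oracle calls, which mirrors the accounting in the abstract: more local steps per round trade oracle calls for fewer communication rounds.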
