Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning
- FedML
Main: 6 pages, 1 figure, 1 table; Bibliography: 3 pages
Abstract
Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms have been lacking unless data-heterogeneity-bounding assumptions are imposed. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
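The abstract does not spell out FedMLS itself, but the central idea of running multiple local steps between communication rounds can be illustrated with a minimal FedAvg-style sketch. The Python snippet below is an assumption, not the authors' method: it performs several stochastic subgradient steps per client on a non-smooth hinge loss before averaging, whereas FedMLS additionally wraps local steps in a projection-efficient outer scheme. All names (`local_subgradient_steps`, `communication_round`), the hinge objective, and the parameter values are hypothetical.

```python
import numpy as np

def local_subgradient_steps(w, data, lr, num_local_steps, rng):
    """Multiple local stochastic subgradient steps on one client.

    Illustrative stand-in only: the actual FedMLS algorithm is
    described in the paper.
    """
    X, y = data
    for _ in range(num_local_steps):
        i = rng.integers(len(y))          # sample one local data point
        margin = y[i] * (X[i] @ w)
        # subgradient of the non-smooth hinge loss max(0, 1 - margin)
        g = -y[i] * X[i] if margin < 1.0 else np.zeros_like(w)
        w = w - lr * g
    return w

def communication_round(w_global, clients, lr, num_local_steps, rng):
    """One round: broadcast w_global, run local steps, average the results."""
    updates = [
        local_subgradient_steps(w_global.copy(), c, lr, num_local_steps, rng)
        for c in clients
    ]
    return np.mean(updates, axis=0)

# Synthetic heterogeneous clients (each with a shifted feature distribution).
rng = np.random.default_rng(0)
d, n_clients = 5, 4
w_star = rng.normal(size=d)
clients = []
for k in range(n_clients):
    X = rng.normal(loc=k, size=(50, d))
    y = np.sign(X @ w_star)
    clients.append((X, y))

w = np.zeros(d)
for _ in range(20):  # 20 communication rounds
    w = communication_round(w, clients, lr=0.05, num_local_steps=10, rng=rng)
```

Because each round communicates only once while performing `num_local_steps` subgradient steps per client, increasing the local work reduces how often the clients must synchronize; the paper's contribution is proving such a reduction without bounding data heterogeneity.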
