Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

Abstract
Multiple local steps are key to communication-efficient federated learning. However, theoretical guarantees for such algorithms, without assumptions bounding data heterogeneity, have been lacking for general non-smooth convex problems. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
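To make the "multiple local steps" setup concrete, below is a minimal sketch of a generic FedAvg-style loop in which each client runs several local stochastic subgradient steps on a non-smooth convex objective (a hinge loss) before the server averages the models. This is only an illustration of the problem setting under assumed names and hyperparameters; it is not the FedMLS algorithm from the paper, whose projection-efficient updates and guarantees differ.

```python
# Generic multiple-local-steps federated loop (illustrative sketch, not FedMLS).
import numpy as np

def local_subgradient_steps(w, X, y, steps, lr, rng):
    """Run `steps` stochastic subgradient steps of the hinge loss on one client."""
    w = w.copy()
    for _ in range(steps):
        i = rng.integers(len(y))
        margin = y[i] * (X[i] @ w)
        # Subgradient of max(0, 1 - margin): -y_i * x_i if margin < 1, else 0.
        g = -y[i] * X[i] if margin < 1 else np.zeros_like(w)
        w -= lr * g
    return w

def federated_rounds(clients, dim, rounds=50, local_steps=10, lr=0.1, seed=0):
    """FedAvg-style loop: each round, clients take local steps, then the server averages."""
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    for _ in range(rounds):  # one communication round per iteration
        local_models = [
            local_subgradient_steps(w, X, y, local_steps, lr, rng)
            for (X, y) in clients
        ]
        w = np.mean(local_models, axis=0)  # server aggregation
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic clients with deliberately different data distributions.
    clients = []
    for shift in (-1.0, 1.0):
        X = rng.normal(shift, 1.0, size=(200, 5))
        y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))
        clients.append((X, y))
    w = federated_rounds(clients, dim=5)
    print("learned weights:", np.round(w, 3))
```

With `local_steps` greater than one, each communication round carries more optimization progress per round; the paper's contribution is proving that such a reduction in communication rounds holds for non-smooth convex problems without heterogeneity-bounding assumptions.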
@article{palenzuela2025_2503.21627,
  title   = {Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning},
  author  = {Karlo Palenzuela and Ali Dadras and Alp Yurtsever and Tommy Löfstedt},
  journal = {arXiv preprint arXiv:2503.21627},
  year    = {2025}
}