
Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

Abstract

Multiple local steps are key to communication-efficient federated learning. However, theoretical guarantees for such algorithms, without assumptions bounding data heterogeneity, have been lacking for general non-smooth convex problems. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an ϵ-suboptimal solution in O(1/ϵ) communication rounds, requiring a total of O(1/ϵ²) stochastic subgradient oracle calls.
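To make the "multiple local steps" template concrete, the sketch below shows the generic pattern the abstract refers to: each client runs several local stochastic subgradient steps between communication rounds, and the server averages the resulting iterates (local SGD / FedAvg style). This is not FedMLS itself and carries no convergence guarantee; the hinge-type objective, the helper names subgradient and local_steps_fl, and the parameters K, lr, and rounds are all illustrative assumptions.

import numpy as np

def subgradient(w, X, y, lam=0.1):
    """Subgradient of an L1-regularized hinge loss at w.

    Illustrative non-smooth convex objective, not necessarily the
    problem class treated in the paper.
    """
    margins = y * (X @ w)                # per-example margins
    mask = margins < 1                   # examples with active hinge
    g = -(X[mask] * y[mask][:, None]).sum(axis=0) / len(y)
    return g + lam * np.sign(w)          # subgradient of the L1 term

def local_steps_fl(clients, w0, rounds=30, K=10, lr=0.05):
    """Generic multiple-local-steps loop (local SGD style), NOT FedMLS.

    One communication round buys K subgradient oracle calls per client,
    which is how multiple local steps trade oracle calls for rounds.
    """
    w = w0.copy()
    for _ in range(rounds):              # communication rounds
        local = []
        for X, y in clients:             # run in parallel in a real system
            wi = w.copy()
            for _ in range(K):           # K local subgradient steps
                wi -= lr * subgradient(wi, X, y)
            local.append(wi)
        w = np.mean(local, axis=0)       # server averages client iterates
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    w_true = rng.normal(size=d)
    clients = []
    for _ in range(4):                   # four synthetic clients
        X = rng.normal(size=(100, d))
        clients.append((X, np.sign(X @ w_true)))
    w = local_steps_fl(clients, np.zeros(d))
    print("final iterate:", w)

Read against the abstract's rates, O(1/ϵ) rounds with an O(1/ϵ²) total oracle budget leaves room for on the order of 1/ϵ local steps per round, which is exactly the trade-off that multiple local steps are meant to exploit.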

@article{palenzuela2025_2503.21627,
  title={Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning},
  author={Karlo Palenzuela and Ali Dadras and Alp Yurtsever and Tommy Löfstedt},
  journal={arXiv preprint arXiv:2503.21627},
  year={2025}
}