Personalized Federated Learning under Model Dissimilarity Constraints

One of the defining challenges in federated learning is statistical heterogeneity among clients. We address this problem with KARULA, a regularized strategy for personalized federated learning that constrains the pairwise model dissimilarities between clients based on the difference in their distributions, as measured by a surrogate for the 1-Wasserstein distance adapted to the federated setting. This allows the strategy to capture highly complex interrelations between clients that, for example, clustered approaches fail to model. We propose an inexact projected stochastic gradient algorithm to solve the constrained problem that the strategy defines, and we show theoretically that, for smooth and possibly non-convex losses, it converges to a neighborhood of a stationary point at rate O(1/K). We demonstrate the effectiveness of KARULA on synthetic and real federated data sets.
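To make the constrained formulation concrete, here is a minimal sketch of one plausible instantiation under stated assumptions. The symbols are not taken from the paper: $N$ denotes the number of clients, $f_i$ and $\theta_i$ client $i$'s expected loss and personalized model, $\widehat{W}_1$ the federated surrogate for the 1-Wasserstein distance between client data distributions $\mathcal{D}_i$ and $\mathcal{D}_j$, and $g$ an assumed nondecreasing map that turns distributional distances into constraint budgets.

\begin{align}
\min_{\theta_1,\dots,\theta_N} \quad & \sum_{i=1}^{N} f_i(\theta_i) \\
\text{s.t.} \quad & \|\theta_i - \theta_j\| \le g\big(\widehat{W}_1(\mathcal{D}_i, \mathcal{D}_j)\big), \quad 1 \le i < j \le N.
\end{align}

Under this reading, clients with similar distributions are forced to keep similar models, while dissimilar clients are left nearly unconstrained, which is how the strategy can encode relations richer than a hard clustering.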
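The inexact projected stochastic gradient algorithm can likewise be illustrated with a small sketch: each round, every client takes a stochastic gradient step on its local loss, and the iterates are then approximately projected onto the pairwise constraint set. The cyclic pairwise projection below is an assumption chosen for illustration (a few sweeps give only an approximate projection, hence "inexact"); it is not claimed to be the paper's projection operator, and all names here are hypothetical.

import numpy as np

def project_pair(ti, tj, eps_ij):
    """Exactly project the pair (ti, tj) onto {(x, y) : ||x - y|| <= eps_ij}."""
    diff = ti - tj
    dist = np.linalg.norm(diff)
    if dist <= eps_ij:
        return ti, tj
    # Move both points symmetrically along the difference direction
    # until the constraint holds with equality.
    shift = 0.5 * (dist - eps_ij) / dist
    return ti - shift * diff, tj + shift * diff

def inexact_projection(thetas, eps, n_sweeps=5):
    """Approximate projection onto all pairwise constraints via a few
    cyclic sweeps; more sweeps give a closer (but still inexact) projection."""
    n = len(thetas)
    for _ in range(n_sweeps):
        for i in range(n):
            for j in range(i + 1, n):
                thetas[i], thetas[j] = project_pair(thetas[i], thetas[j], eps[i, j])
    return thetas

def train(thetas, eps, stoch_grads, lr=0.1, rounds=100):
    """Projected SGD sketch: local stochastic gradient steps on each
    client's loss, followed by an inexact projection onto the
    pairwise dissimilarity constraints."""
    for _ in range(rounds):
        for i, grad_fn in enumerate(stoch_grads):
            thetas[i] = thetas[i] - lr * grad_fn(thetas[i])
        thetas = inexact_projection(thetas, eps)
    return thetas

Here thetas is a list of per-client parameter vectors, eps an N-by-N array of constraint budgets, and stoch_grads a list of callables returning stochastic gradients of each client's loss.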
@article{erickson2025_2505.07575,
  title={Personalized Federated Learning under Model Dissimilarity Constraints},
  author={Samuel Erickson and Mikael Johansson},
  journal={arXiv preprint arXiv:2505.07575},
  year={2025}
}