ResearchTrend.AI

Lazy But Effective: Collaborative Personalized Federated Learning with Heterogeneous Data

5 May 2025
Ljubomir Rokvic
Panayiotis Danassis
Boi Faltings
Abstract

In Federated Learning, heterogeneity in client data distributions often means that a single global model does not have the best performance for individual clients. Consider for example training a next-word prediction model for keyboards: user-specific language patterns due to demographics (dialect, age, etc.), language proficiency, and writing style result in a highly non-IID dataset across clients. Other examples are medical images taken with different machines, or driving data from different vehicle types. To address this, we propose a simple yet effective personalized federated learning framework (pFedLIA) that utilizes a computationally efficient influence approximation, called "Lazy Influence", to cluster clients in a distributed manner before model aggregation. Within each cluster, data owners collaborate to jointly train a model that captures the specific data patterns of the clients. Our method has been shown to successfully recover the global model's performance drop due to the non-IID-ness in various synthetic and real-world settings, specifically a next-word prediction task on the Nordic languages as well as several benchmark tasks. It matches the performance of a hypothetical Oracle clustering, and significantly improves on existing baselines, e.g., an improvement of 17% on CIFAR100.
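The abstract describes a two-step idea: cluster clients by an influence-based similarity measure, then aggregate models only within each cluster. The paper's actual "Lazy Influence" approximation is not specified in this abstract, so the toy sketch below substitutes a simple cross-loss proxy (how well client i's local model fits client j's data) and a hypothetical greedy threshold clustering, followed by FedAvg-style averaging within each cluster. All names, the threshold value, and the clustering rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 6 clients drawn from 2 latent distributions (non-IID): linear
# regression data y = w_k . x + noise with cluster-specific weights w_k.
def make_client(w, n=50):
    X = rng.normal(size=(n, 2))
    y = X @ w + 0.01 * rng.normal(size=n)
    return X, y

true_w = [np.array([1.0, -1.0]), np.array([-1.0, 1.0])]
clients = [make_client(true_w[i % 2]) for i in range(6)]

def fit(X, y):
    # Local least-squares model (stand-in for each client's locally trained model).
    return np.linalg.lstsq(X, y, rcond=None)[0]

def loss(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

local = [fit(X, y) for X, y in clients]

# Influence *proxy* (NOT the paper's Lazy Influence): cross-loss of client i's
# model on client j's data. Similar data distributions give low cross-loss.
n = len(clients)
cross = np.array([[loss(local[i], *clients[j]) for j in range(n)]
                  for i in range(n)])

# Greedy threshold clustering on the symmetrized cross-loss matrix
# (assumed clustering rule; threshold chosen for this toy data).
sim = (cross + cross.T) / 2
threshold = 1.0
labels = [-1] * n
for i in range(n):
    if labels[i] == -1:
        labels[i] = max(labels) + 1
        for j in range(i + 1, n):
            if labels[j] == -1 and sim[i, j] < threshold:
                labels[j] = labels[i]

# Per-cluster aggregation: average local models within each cluster,
# so each cluster gets a personalized model instead of one global model.
cluster_models = {
    c: np.mean([local[i] for i in range(n) if labels[i] == c], axis=0)
    for c in set(labels)
}

print(labels)  # clients 0/2/4 and 1/3/5 fall into two separate clusters
```

The averaged model of each cluster recovers that cluster's underlying weights, which is the effect the paper reports: per-cluster models close the performance gap a single global average would leave on non-IID clients.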

@article{rokvic2025_2505.02540,
  title={Lazy But Effective: Collaborative Personalized Federated Learning with Heterogeneous Data},
  author={Ljubomir Rokvic and Panayiotis Danassis and Boi Faltings},
  journal={arXiv preprint arXiv:2505.02540},
  year={2025}
}