Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training

16 December 2024
Milad Soltany
Farhad Pourpanah
Mahdiyar Molahasani
Michael A. Greenspan
Ali Etemad
Abstract

In this paper, we propose a novel approach, Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training (FedSB), to address the challenges of data heterogeneity within a federated learning framework. FedSB utilizes label smoothing at the client level to prevent overfitting to domain-specific features, thereby enhancing generalization across diverse domains when aggregating local models into a global model. Additionally, FedSB incorporates a decentralized budgeting mechanism that balances training among clients, which is shown to improve the performance of the aggregated global model. Extensive experiments on four commonly used multi-domain datasets, PACS, VLCS, OfficeHome, and TerraInc, demonstrate that FedSB outperforms competing methods, achieving state-of-the-art results on three out of the four datasets and indicating its effectiveness in addressing data heterogeneity.
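To make the two ideas in the abstract concrete, the sketch below pairs label-smoothed cross-entropy on each client with an equal per-client share of a fixed global update budget, followed by simple FedAvg-style weight averaging. This is a minimal PyTorch sketch, not the authors' implementation: the function names, the even budget-splitting rule, and the hyperparameters are illustrative assumptions based only on the abstract's description.

# Hypothetical sketch: client-level label smoothing plus an evenly split
# training budget, aggregated by weight averaging. Assumptions, not the
# paper's actual method.
import copy
import torch
import torch.nn as nn

def local_train(model, loader, budget, smoothing=0.1, lr=0.01):
    """Train a client copy for a fixed number of update steps (its budget),
    using label-smoothed cross-entropy to discourage overconfident,
    domain-specific predictions."""
    model = copy.deepcopy(model)
    criterion = nn.CrossEntropyLoss(label_smoothing=smoothing)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    steps = 0
    while steps < budget:
        for x, y in loader:
            if steps >= budget:
                break
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
            steps += 1
    return model.state_dict()

def federated_round(global_model, client_loaders, total_budget=1000):
    """One communication round: give every client an equal share of the
    total update budget, then average the returned weights."""
    budget = total_budget // len(client_loaders)  # balanced per-client budget
    client_states = [
        local_train(global_model, loader, budget) for loader in client_loaders
    ]
    # Element-wise mean of client weights (FedAvg-style aggregation).
    avg_state = {
        k: torch.stack([s[k].float() for s in client_states]).mean(dim=0)
        for k in client_states[0]
    }
    global_model.load_state_dict(avg_state)
    return global_model

Splitting one fixed budget evenly keeps clients with larger local datasets from contributing disproportionately many updates to the averaged model, which is one plausible reading of the balancing effect the abstract attributes to the budgeting mechanism.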

@article{soltany2025_2412.11408,
  title={Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training},
  author={Milad Soltany and Farhad Pourpanah and Mahdiyar Molahasani and Michael Greenspan and Ali Etemad},
  journal={arXiv preprint arXiv:2412.11408},
  year={2025}
}