ResearchTrend.AI

One-shot Federated Learning Methods: A Practical Guide

13 February 2025
Xiang Liu
Zhenheng Tang
Xia Li
Yijun Song
Sijie Ji
Zemin Liu
Bo Han
Linshan Jiang
Jialin Li
Abstract

One-shot Federated Learning (OFL) is a distributed machine learning paradigm that constrains client-server communication to a single round, addressing the privacy and communication-overhead issues associated with the multiple rounds of data exchange in traditional Federated Learning (FL). OFL shows practical potential for integration with future approaches that require collaboratively trained models, such as large language models (LLMs). However, current OFL methods face two major challenges, data heterogeneity and model heterogeneity, which result in subpar performance compared to conventional FL methods. Worse still, despite numerous studies addressing these limitations, a comprehensive summary is still lacking. To address these gaps, this paper presents a systematic analysis of the challenges faced by OFL and thoroughly reviews the current methods. We also offer an innovative categorization method and analyze the trade-offs of various techniques. Additionally, we discuss the most promising future directions and the technologies that should be integrated into the OFL field. This work aims to provide guidance and insights for future research.
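To make the single-round constraint concrete, the sketch below shows a minimal one-shot FL workflow in NumPy: each client trains a local logistic-regression model on its own (heterogeneous) data, uploads its parameters exactly once, and the server performs a single aggregation step. The parameter-averaging aggregation is a FedAvg-style illustration of the general idea, not a specific method from the paper; all function names and the synthetic data are hypothetical.

```python
import numpy as np

def train_local(X, y, lr=0.1, epochs=200):
    """Client step: fit a local logistic-regression model with plain gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # gradient of the logistic loss
    return w

def one_shot_aggregate(client_weights):
    """Server step: a single round of parameter averaging over all client uploads."""
    return np.mean(client_weights, axis=0)

# Synthetic demo: three clients whose feature distributions are shifted
# relative to each other, a toy form of the data heterogeneity the paper discusses.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for shift in (-0.5, 0.0, 0.5):
    X = rng.normal(shift, 1.0, size=(200, 2))
    y = (X @ true_w + rng.normal(0, 0.1, 200) > 0).astype(float)
    clients.append((X, y))

# One communication round: clients upload weights once, server aggregates once.
local_ws = [train_local(X, y) for X, y in clients]
global_w = one_shot_aggregate(local_ws)
```

Note that no model or gradient ever travels back to the clients for further rounds; the quality of `global_w` therefore depends entirely on how well a single aggregation can reconcile the heterogeneous local solutions, which is exactly the tension the survey analyzes.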

@article{liu2025_2502.09104,
  title={One-shot Federated Learning Methods: A Practical Guide},
  author={Xiang Liu and Zhenheng Tang and Xia Li and Yijun Song and Sijie Ji and Zemin Liu and Bo Han and Linshan Jiang and Jialin Li},
  journal={arXiv preprint arXiv:2502.09104},
  year={2025}
}