Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning

14 March 2025
Jihyun Lim
Junhyuk Jo
Tuo Zhang
Salman Avestimehr
Sunwoo Lee
    FedML
Abstract

Online Knowledge Distillation (KD) has recently been highlighted as a way to train large models in Federated Learning (FL) environments. Many existing studies adopt the logit ensemble method to perform KD on the server side. However, they often assume that the unlabeled data collected at the edge is centralized on the server. Moreover, the logit ensemble method personalizes local models, which can degrade the quality of the soft targets, especially when the data is highly non-IID. To address these critical limitations, we propose a novel on-device KD-based heterogeneous FL method. Our approach leverages a small auxiliary model to learn from labeled local data. Subsequently, a subset of clients with strong system resources transfers knowledge to a large model through on-device KD using their unlabeled data. Our extensive experiments demonstrate that our on-device KD-based heterogeneous FL method effectively utilizes the system resources of all edge devices as well as the unlabeled data, resulting in higher accuracy than SOTA KD-based FL methods.
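
The abstract describes the core mechanism: a resource-rich client distills a small, locally trained auxiliary model (the teacher) into the large model (the student) on the device, using its own unlabeled data. The sketch below illustrates what one such on-device distillation step could look like in PyTorch; the function name, temperature, and loss formulation are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch (assumed, not the paper's code): one on-device KD step on a
# strong client. The small auxiliary model trained on labeled local data acts
# as the teacher; the large model is the student and learns from unlabeled data.
import torch
import torch.nn as nn
import torch.nn.functional as F

def on_device_kd_step(student: nn.Module,
                      teacher: nn.Module,
                      unlabeled_batch: torch.Tensor,
                      optimizer: torch.optim.Optimizer,
                      temperature: float = 2.0) -> float:
    """Run one distillation step on a batch of the client's unlabeled data."""
    teacher.eval()
    student.train()

    with torch.no_grad():
        # Soft targets come from the small auxiliary (teacher) model.
        teacher_logits = teacher(unlabeled_batch)

    student_logits = student(unlabeled_batch)

    # Standard KD loss: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, only clients with sufficient memory and compute would run the large student model locally, while weaker clients contribute by training the small auxiliary model on their labeled data.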

@article{lim2025_2503.11151,
  title={Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning},
  author={Jihyun Lim and Junhyuk Jo and Tuo Zhang and Salman Avestimehr and Sunwoo Lee},
  journal={arXiv preprint arXiv:2503.11151},
  year={2025}
}