Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation

28 April 2025
Kitsuya Azuma
Takayuki Nishio
Yuichi Kitagawa
Wakako Nakano
Takahito Tanimura
Abstract

Federated Learning (FL) enables collaborative model training across decentralized clients, enhancing privacy by keeping data local. Yet conventional FL, relying on frequent parameter-sharing, suffers from high communication overhead and limited model heterogeneity. Distillation-based FL approaches address these issues by sharing predictions (soft-labels) instead, but they often involve redundant transmissions across communication rounds, reducing efficiency. We propose SCARLET, a novel framework integrating synchronized soft-label caching and an enhanced Entropy Reduction Aggregation (Enhanced ERA) mechanism. SCARLET minimizes redundant communication by reusing cached soft-labels, achieving up to 50% reduction in communication costs compared to existing methods while maintaining accuracy. Enhanced ERA can be tuned to adapt to non-IID data variations, ensuring robust aggregation and performance in diverse client scenarios. Experimental evaluations demonstrate that SCARLET consistently outperforms state-of-the-art distillation-based FL methods in terms of accuracy and communication efficiency. The implementation of SCARLET is publicly available at this https URL.
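
The abstract describes the mechanism only at a high level. The minimal Python sketch below (not the authors' released code) illustrates the two ideas it names: clients reuse cached soft-labels and re-send only those that have drifted, and the server averages received soft-labels and sharpens them with a tunable temperature, in the spirit of Enhanced ERA. The function names, drift threshold, and temperature parameter are illustrative assumptions, not details taken from the paper.

import numpy as np

def aggregate_and_sharpen(client_soft_labels, temperature=0.5):
    """Average per-sample soft-labels from clients, then sharpen the result.

    client_soft_labels: list of arrays, each of shape (num_samples, num_classes)
    temperature: values below 1.0 sharpen (reduce entropy); above 1.0 smooth.
    """
    avg = np.mean(np.stack(client_soft_labels, axis=0), axis=0)
    sharpened = avg ** (1.0 / temperature)  # raise to a power to reduce entropy
    return sharpened / sharpened.sum(axis=1, keepdims=True)  # renormalize to distributions

def labels_to_transmit(new_labels, cached_labels, tol=0.05):
    """Hypothetical caching rule: re-send a sample's soft-label only when it
    drifts from the cached copy by more than tol; otherwise reuse the cache."""
    drift = np.abs(new_labels - cached_labels).max(axis=1)
    changed = drift > tol
    return changed, new_labels[changed]

With a temperature below 1.0 the aggregated distribution is pushed toward its dominant class, which is the entropy-reducing behaviour the abstract attributes to Enhanced ERA; the caching rule is what cuts the redundant round-to-round transmissions.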

@article{azuma2025_2504.19602,
  title={Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation},
  author={Kitsuya Azuma and Takayuki Nishio and Yuichi Kitagawa and Wakako Nakano and Takahito Tanimura},
  journal={arXiv preprint arXiv:2504.19602},
  year={2025}
}