FedCache 2.0: Exploiting the Potential of Distilled Data in Knowledge Cache-driven Federated Learning

22 May 2024
Quyang Pan, Sheng Sun, Zhiyuan Wu, Yuwei Wang, Min Liu, Bo Gao
FedML

Papers citing "FedCache 2.0: Exploiting the Potential of Distilled Data in Knowledge Cache-driven Federated Learning" (3 of 3 papers shown)

Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation
Kitsuya Azuma, Takayuki Nishio, Yuichi Kitagawa, Wakako Nakano, Takahito Tanimura
FedML
28 Apr 2025

Towards Personalized Federated Learning
A. Tan, Han Yu, Li-zhen Cui, Qiang Yang
FedML, AI4CE
01 Mar 2021

FedML: A Research Library and Benchmark for Federated Machine Learning
Chaoyang He, Songze Li, Jinhyun So, Xiao Zeng, Mi Zhang, ..., Yang Liu, Ramesh Raskar, Qiang Yang, M. Annavaram, Salman Avestimehr
FedML
27 Jul 2020