Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning

11 February 2024
Mohak Chadha, Pulkit Khera, Jianfeng Gu, Osama Abboud, Michael Gerndt
FedML

Papers citing "Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning" (4 of 4 papers shown)

Papaya: Practical, Private, and Scalable Federated Learning
Dzmitry Huba, John Nguyen, Kshitiz Malik, Ruiyu Zhu, Michael G. Rabbat, ..., H. Srinivas, Kaikai Wang, Anthony Shoumikhin, Jesik Min, Mani Malek
FedML
99 · 135 · 0
08 Nov 2021

FedLess: Secure and Scalable Federated Learning Using Serverless Computing
Andreas Grafberger, Mohak Chadha, Anshul Jindal, Jianfeng Gu, Michael Gerndt
30 · 49 · 0
05 Nov 2021

IBM Federated Learning: an Enterprise Framework White Paper V0.1
Heiko Ludwig, Nathalie Baracaldo, Gegi Thomas, Yi Zhou, Ali Anwar, ..., Sean Laguna, Mikhail Yurochkin, Mayank Agarwal, Ebube Chuba, Annie Abay
FedML
124 · 156 · 0
22 Jul 2020

funcX: A Federated Function Serving Fabric for Science
Ryan Chard, Y. Babuji, Zhuozhao Li, Tyler J. Skluzacek, A. Woodard, B. Blaiszik, Ian T. Foster, Kyle Chard
33 · 187 · 0
07 May 2020