Adaptive Distillation for Decentralized Learning from Heterogeneous Clients

18 August 2020
Jiaxin Ma, Ryo Yonetani, Z. Iqbal
FedML

Papers citing "Adaptive Distillation for Decentralized Learning from Heterogeneous Clients"

4 papers shown

Decentralized Learning with Multi-Headed Distillation
A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov
FedML
40 · 4 · 0
28 Nov 2022

Personalized Federated Learning for Heterogeneous Clients with Clustered Knowledge Transfer
Yae Jee Cho, Jianyu Wang, Tarun Chiruvolu, Gauri Joshi
FedML
35 · 31 · 0
16 Sep 2021

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
278 · 404 · 0
09 Apr 2018

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
OOD, MoMe
267 · 1,275 · 0
06 Mar 2017