Adaptive Distillation for Decentralized Learning from Heterogeneous Clients
Jiaxin Ma, Ryo Yonetani, Z. Iqbal
18 August 2020 · arXiv:2008.07948 · FedML

Papers citing "Adaptive Distillation for Decentralized Learning from Heterogeneous Clients" (4 of 4 papers shown)

Decentralized Learning with Multi-Headed Distillation
A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov
FedML · 40 · 4 · 0 · 28 Nov 2022

Personalized Federated Learning for Heterogeneous Clients with Clustered Knowledge Transfer
Yae Jee Cho, Jianyu Wang, Tarun Chiruvolu, Gauri Joshi
FedML · 35 · 31 · 0 · 16 Sep 2021

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 278 · 404 · 0 · 09 Apr 2018

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
OOD · MoMe · 267 · 1,275 · 0 · 06 Mar 2017