Adaptive Self-Distillation for Minimizing Client Drift in Heterogeneous Federated Learning
arXiv:2305.19600 · 31 May 2023
M. Yashwanth, Gaurav Kumar Nayak, Aryaveer Singh, Yogesh Singh, Anirban Chakraborty
FedML
Papers citing "Adaptive Self-Distillation for Minimizing Client Drift in Heterogeneous Federated Learning" (4 of 4 papers shown)
FedSpeed: Larger Local Interval, Less Communication Round, and Higher Generalization Accuracy
Yan Sun, Li Shen, Tiansheng Huang, Liang Ding, Dacheng Tao
FedML · 21 Feb 2023
A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML · 14 Jul 2021
The Future of Digital Health with Federated Learning
Nicola Rieke, Jonny Hancox, Wenqi Li, Fausto Milletari, H. Roth, ..., Ronald M. Summers, Andrew Trask, Daguang Xu, Maximilian Baust, M. Jorge Cardoso
OOD · 18 Mar 2020
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL · 15 Sep 2016