Knowledge Distillation for Federated Learning: a Practical Guide
arXiv:2211.04742 · 9 November 2022
Alessio Mora, Irene Tenison, Paolo Bellavista, Irina Rish
FedML
Papers citing "Knowledge Distillation for Federated Learning: a Practical Guide" (8 papers shown)

- FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher · Alessio Mora, Lorenzo Valerio, Paolo Bellavista, A. Passarella (FedML, MU) · 14 Aug 2024
- Federated Learning for 6G: Paradigms, Taxonomy, Recent Advances and Insights · Maryam Ben Driss, Essaid Sabir, H. Elbiaze, Walid Saad · 07 Dec 2023
- A Survey of What to Share in Federated Learning: Perspectives on Model Utility, Privacy Leakage, and Communication Efficiency · Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Yuyi Mao, Jun Zhang (FedML) · 20 Jul 2023
- Knowledge Distillation in Federated Edge Learning: A Survey · Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao (FedML) · 14 Jan 2023
- A Field Guide to Federated Optimization · Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu (FedML) · 14 Jul 2021
- Federated Learning on Non-IID Data Silos: An Experimental Study · Q. Li, Yiqun Diao, Quan Chen, Bingsheng He (FedML, OOD) · 03 Feb 2021
- FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization · Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani (FedML) · 28 Sep 2019
- Large scale distributed neural network training through online distillation · Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton (FedML) · 09 Apr 2018