Decentralized Learning with Multi-Headed Distillation
arXiv 2211.15774 · 28 November 2022
A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov
FedML
Papers citing "Decentralized Learning with Multi-Headed Distillation" (6 of 6 shown)
Dataset Distillation by Automatic Training Trajectories
Dai Liu, Jindong Gu, Hu Cao, Carsten Trinitis, Martin Schulz
DD · 19 Jul 2024

Harnessing Increased Client Participation with Cohort-Parallel Federated Learning
Akash Dhasade, Anne-Marie Kermarrec, Tuan-Anh Nguyen, Rafael Pires, M. Vos
FedML · 24 May 2024

DIMAT: Decentralized Iterative Merging-And-Training for Deep Learning Models
Nastaran Saadati, Minh Pham, Nasla Saleem, Joshua R. Waite, Aditya Balu, Zhanhong Jiang, Chinmay Hegde, Soumik Sarkar
MoMe · 11 Apr 2024

FedProto: Federated Prototype Learning across Heterogeneous Clients
Yue Tan, Guodong Long, Lu Liu, Tianyi Zhou, Qinghua Lu, Jing Jiang, Chengqi Zhang
FedML · 01 May 2021

Towards Personalized Federated Learning
A. Tan, Han Yu, Li-zhen Cui, Qiang Yang
FedML, AI4CE · 01 Mar 2021

Survey of Personalization Techniques for Federated Learning
V. Kulkarni, Milind Kulkarni, Aniruddha Pant
FedML · 19 Mar 2020