
Decentralized Learning with Multi-Headed Distillation (arXiv:2211.15774)

28 November 2022
A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov
FedML

Papers citing "Decentralized Learning with Multi-Headed Distillation" (6 papers shown)

  1. Dataset Distillation by Automatic Training Trajectories
     Dai Liu, Jindong Gu, Hu Cao, Carsten Trinitis, Martin Schulz · DD · 19 Jul 2024
  2. Harnessing Increased Client Participation with Cohort-Parallel Federated Learning
     Akash Dhasade, Anne-Marie Kermarrec, Tuan-Anh Nguyen, Rafael Pires, M. Vos · FedML · 24 May 2024
  3. DIMAT: Decentralized Iterative Merging-And-Training for Deep Learning Models
     Nastaran Saadati, Minh Pham, Nasla Saleem, Joshua R. Waite, Aditya Balu, Zhanhong Jiang, Chinmay Hegde, Soumik Sarkar · MoMe · 11 Apr 2024
  4. FedProto: Federated Prototype Learning across Heterogeneous Clients
     Yue Tan, Guodong Long, Lu Liu, Tianyi Zhou, Qinghua Lu, Jing Jiang, Chengqi Zhang · FedML · 01 May 2021
  5. Towards Personalized Federated Learning
     A. Tan, Han Yu, Li-zhen Cui, Qiang Yang · FedML, AI4CE · 01 Mar 2021
  6. Survey of Personalization Techniques for Federated Learning
     V. Kulkarni, Milind Kulkarni, Aniruddha Pant · FedML · 19 Mar 2020