ResearchTrend.AI

Knowledge Distillation in Federated Edge Learning: A Survey
arXiv:2301.05849 · Cited By

14 January 2023
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
FedML

Papers citing "Knowledge Distillation in Federated Edge Learning: A Survey"

5 / 5 papers shown

Data-Free Black-Box Federated Learning via Zeroth-Order Gradient Estimation
Xinge Ma, Jin Wang, Xuejie Zhang
FedML · 60 · 0 · 0 · 08 Mar 2025
Logits Poisoning Attack in Federated Distillation
Yuhan Tang, Zhiyuan Wu, Bo Gao, Tian Wen, Yuwei Wang, Sheng Sun
FedML, AAML · 18 · 1 · 0 · 08 Jan 2024
Federated Skewed Label Learning with Logits Fusion
Yuwei Wang, Runhan Li, Hao Tan, Xue Jiang, Sheng Sun, Min Liu, Bo Gao, Zhiyuan Wu
FedML · 17 · 5 · 0 · 14 Nov 2023
A Survey of What to Share in Federated Learning: Perspectives on Model Utility, Privacy Leakage, and Communication Efficiency
Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Yuyi Mao, Jun Zhang
FedML · 24 · 22 · 0 · 20 Jul 2023
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 267 · 402 · 0 · 09 Apr 2018