ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Learning Critically: Selective Self Distillation in Federated Learning on Non-IID Data

20 April 2025
Yuting He, Yiqiang Chen, Xiaodong Yang, H. Yu, Yi-Hua Huang, Yang Gu
FedML
ArXiv · PDF · HTML

Papers citing "Learning Critically: Selective Self Distillation in Federated Learning on Non-IID Data"

10 / 10 papers shown
Federated Progressive Self-Distillation with Logits Calibration for Personalized IIoT Edge Intelligence (30 Nov 2024)
Yingchao Wang, Wenqi Niu
79 · 0 · 0

Non-IID Data in Federated Learning: A Survey with Taxonomy, Metrics, Methods, Frameworks and Future Directions (19 Nov 2024)
Daniel Gutiérrez, David Solans, Mikko A. Heikkilä, A. Vitaletti, Nicolas Kourtellis, Aris Anagnostopoulos, I. Chatzigiannakis
OOD
84 · 0 · 0

FedProK: Trustworthy Federated Class-Incremental Learning via Prototypical Feature Knowledge Transfer (04 May 2024)
Xin Gao, Xin Yang, Hao Yu, Yan Kang, Tianrui Li
CLL
33 · 1 · 0

Federated Distillation: A Survey (02 Apr 2024)
Lin Li, Jianping Gou, Baosheng Yu, Lan Du, Zhang Yi, Dacheng Tao
DD, FedML
46 · 4 · 0

FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning (28 Nov 2023)
Pengchao Han, Xingyan Shi, Jianwei Huang
FedML
20 · 3 · 0

UNIDEAL: Curriculum Knowledge Distillation Federated Learning (16 Sep 2023)
Yuwen Yang, Chang Liu, Xun Cai, Suizhi Huang, Hongtao Lu, Yue Ding
FedML
29 · 8 · 0

Teacher-Student Architecture for Knowledge Distillation: A Survey (08 Aug 2023)
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
16 · 16 · 0

Adapter-based Selective Knowledge Distillation for Federated Multi-domain Meeting Summarization (07 Aug 2023)
Xiachong Feng, Xiaocheng Feng, Xiyuan Du, Min-Yen Kan, Bing Qin
FedML
11 · 5 · 0

Adaptive Self-Distillation for Minimizing Client Drift in Heterogeneous Federated Learning (31 May 2023)
M. Yashwanth, Gaurav Kumar Nayak, Aryaveer Singh, Yogesh Singh, Anirban Chakraborty
FedML
19 · 1 · 0

Knowledge Distillation for Federated Learning: A Practical Guide (09 Nov 2022)
Alessio Mora, Irene Tenison, Paolo Bellavista, Irina Rish
FedML
14 · 17 · 0