FedRAD: Federated Robust Adaptive Distillation

2 December 2021
Authors: Stefán Páll Sturluson, Samuel Trew, Luis Muñoz-González, Matei Grama, Jonathan Passerat-Palmbach, Daniel Rueckert, A. Alansary
Topics: FedML

Papers citing "FedRAD: Federated Robust Adaptive Distillation"

4 papers shown.

On the Byzantine-Resilience of Distillation-Based Federated Learning
Christophe Roux, Max Zimmer, S. Pokutta
Topics: AAML
19 Feb 2024

A Survey on Vulnerability of Federated Learning: A Learning Algorithm Perspective
Xianghua Xie, Chen Hu, Hanchi Ren, Jingjing Deng
Topics: FedML, AAML
27 Nov 2023

Decentralized Learning with Multi-Headed Distillation
A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov
Topics: FedML
28 Nov 2022

Towards Efficient Communications in Federated Learning: A Contemporary Survey
Zihao Zhao, Yuzhu Mao, Yang Liu, Linqi Song, Ouyang Ye, Xinlei Chen, Wenbo Ding
Topics: FedML
02 Aug 2022