No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices

arXiv:2202.08036, 16 February 2022
Ruixuan Liu, Fangzhao Wu, Chuhan Wu, Yanlin Wang, Lingjuan Lyu, Hong Chen, Xing Xie
FedML

Papers citing "No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices"

12 papers shown
Moss: Proxy Model-based Full-Weight Aggregation in Federated Learning with Heterogeneous Models
Y. Cai, Ziqi Zhang, Ding Li, Yao Guo, Xiangqun Chen
13 Mar 2025

Tackling Feature and Sample Heterogeneity in Decentralized Multi-Task Learning: A Sheaf-Theoretic Approach
Chaouki Ben Issaid, Praneeth Vepakomma, Mehdi Bennis
03 Feb 2025

Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning
Xi Chen, Qin Li, Haibin Cai, Ting Wang
28 Jan 2025

Breaking the Memory Wall for Heterogeneous Federated Learning via Progressive Training
Yebo Wu, Li Li, Chengzhong Xu
FedML
20 Apr 2024

When Foundation Model Meets Federated Learning: Motivations, Challenges, and Future Directions
Weiming Zhuang, Chen Chen, Lingjuan Lyu, Chong Chen, Yaochu Jin
AIFin, AI4CE
27 Jun 2023

When Federated Learning Meets Pre-trained Language Models' Parameter-Efficient Tuning Methods
Zhuo Zhang, Yuanhang Yang, Yong Dai, Lizhen Qu, Zenglin Xu
FedML
20 Dec 2022

A Snapshot of the Frontiers of Client Selection in Federated Learning
Gergely Németh, M. Lozano, Novi Quadrianto, Nuria Oliver
FedML
27 Sep 2022

Asynchronous Federated Learning on Heterogeneous Devices: A Survey
Chenhao Xu, Youyang Qu, Yong Xiang, Longxiang Gao
FedML
09 Sep 2021

Privacy and Robustness in Federated Learning: Attacks and Defenses
Lingjuan Lyu, Han Yu, Xingjun Ma, Chen Chen, Lichao Sun, Jun Zhao, Qiang Yang, Philip S. Yu
FedML
07 Dec 2020

FedNER: Privacy-preserving Medical Named Entity Recognition with Federated Learning
Suyu Ge, Fangzhao Wu, Chuhan Wu, Tao Qi, Yongfeng Huang, Xing Xie
20 Mar 2020

A Survey on Bias and Fairness in Machine Learning
Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
SyDa, FaML
23 Aug 2019

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM
20 Apr 2018