Achieving Linear Speedup in Asynchronous Federated Learning with Heterogeneous Clients

Xiaolu Wang, Zijian Li, Shi Jin, Jun Zhang
17 February 2024
arXiv:2402.11198
FedML

Papers citing "Achieving Linear Speedup in Asynchronous Federated Learning with Heterogeneous Clients"

5 papers shown
RockNet: Distributed Learning on Ultra-Low-Power Devices
Alexander Gräfe, Fabian Mager, Marco Zimmerling, Sebastian Trimpe
ACM Transactions on Cyber-Physical Systems (TCPS), 2025
15 Oct 2025
Convergence Analysis of Asynchronous Federated Learning with Gradient Compression for Non-Convex Optimization
Diying Yang, Yingwei Hou, Danyang Xiao
FedML
28 Apr 2025
FedMHO: Heterogeneous One-Shot Federated Learning Towards Resource-Constrained Edge Devices
Dezhong Yao, Yuexin Shi, Tongtong Liu, Zhiqiang Xu
12 Feb 2025
Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets
Xingrun Yan, Shiyuan Zuo, Rongfei Fan, Han Hu, Li Shen, Puning Zhao, Yong Luo
IEEE Transactions on Mobile Computing (IEEE TMC), 2024
FedML
19 Aug 2024
Dual-Delayed Asynchronous SGD for Arbitrarily Heterogeneous Data
Xiaolu Wang, Yuchang Sun, Hoi-To Wai, Jun Zhang
27 May 2024