
Communication-Efficient Federated Learning With Data and Client Heterogeneity
arXiv:2206.10032 · 20 June 2022
Hossein Zakerinia, Shayan Talaei, Giorgi Nadiradze, Dan Alistarh
FedML

Papers citing "Communication-Efficient Federated Learning With Data and Client Heterogeneity"

3 / 3 papers shown

Achieving Dimension-Free Communication in Federated Learning via Zeroth-Order Optimization
Zhe Li, Bicheng Ying, Zidong Liu, Haibo Yang
FedML · 24 May 2024
DASHA: Distributed Nonconvex Optimization with Communication Compression, Optimal Oracle Complexity, and No Client Synchronization
A. Tyurin, Peter Richtárik
02 Feb 2022
FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML · 28 Sep 2019