ResearchTrend.AI
Escaping Saddle Points in Heterogeneous Federated Learning via Distributed SGD with Communication Compression

29 October 2023
Sijin Chen, Zhize Li, Yuejie Chi
FedML

Papers citing "Escaping Saddle Points in Heterogeneous Federated Learning via Distributed SGD with Communication Compression"

3 / 3 papers shown
Convergence Analysis of Asynchronous Federated Learning with Gradient Compression for Non-Convex Optimization
Diying Yang, Yingwei Hou, Danyang Xiao, Weigang Wu
FedML, 28 Apr 2025
LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
Laurent Condat, A. Maranjyan, Peter Richtárik
07 Mar 2024
EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik
07 Oct 2021