ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients

14 February 2021 · A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani · FedML

Papers citing "Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients"

4 papers shown
Straggler-Resilient Federated Learning: Leveraging the Interplay Between Statistical Accuracy and System Heterogeneity
Amirhossein Reisizadeh, Isidoros Tziotis, Hamed Hassani, Aryan Mokhtari, Ramtin Pedarsani
FedML · 28 Dec 2020
Linearly Converging Error Compensated SGD
Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik
23 Oct 2020
FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML · 28 Sep 2019
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt
16 Aug 2016