ResearchTrend.AI

Local SGD: Unified Theory and New Efficient Methods
Eduard A. Gorbunov, Filip Hanzely, Peter Richtárik
arXiv:2011.02828 · 3 November 2020
FedML

Papers citing "Local SGD: Unified Theory and New Efficient Methods"

25 / 75 papers shown
The Role of Local Steps in Local SGD
Tiancheng Qin, S. Rasoul Etesami, César A. Uribe
14 Mar 2022

Distributed Methods with Absolute Compression and Error Compensation
Marina Danilova, Eduard A. Gorbunov
04 Mar 2022

ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!
Konstantin Mishchenko, Grigory Malinovsky, Sebastian U. Stich, Peter Richtárik
18 Feb 2022

BEER: Fast $O(1/T)$ Rate for Decentralized Nonconvex Optimization with Communication Compression
Haoyu Zhao, Boyue Li, Zhize Li, Peter Richtárik, Yuejie Chi
31 Jan 2022

Faster Convergence of Local SGD for Over-Parameterized Models
Tiancheng Qin, S. Rasoul Etesami, César A. Uribe
FedML
30 Jan 2022

Server-Side Stepsizes and Sampling Without Replacement Provably Help in Federated Optimization
Grigory Malinovsky, Konstantin Mishchenko, Peter Richtárik
FedML
26 Jan 2022

Faster Rates for Compressed Federated Learning with Client-Variance Reduction
Haoyu Zhao, Konstantin Burlachenko, Zhize Li, Peter Richtárik
FedML
24 Dec 2021

Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning
Xun Qian, Rustem Islamov, M. Safaryan, Peter Richtárik
FedML
02 Nov 2021

Exploiting Heterogeneity in Robust Federated Best-Arm Identification
A. Mitra, Hamed Hassani, George Pappas
FedML
13 Sep 2021

Towards Out-Of-Distribution Generalization: A Survey
Jiashuo Liu, Zheyan Shen, Yue He, Xingxuan Zhang, Renzhe Xu, Han Yu, Peng Cui
CML, OOD
31 Aug 2021

FedChain: Chained Algorithms for Near-Optimal Communication Cost in Federated Learning
Charlie Hou, K. K. Thekumparampil, Giulia Fanti, Sewoong Oh
FedML
16 Aug 2021

An Operator Splitting View of Federated Learning
Saber Malekmohammadi, K. Shaloudegi, Zeou Hu, Yaoliang Yu
FedML
12 Aug 2021

FedPAGE: A Fast Local Stochastic Gradient Method for Communication-Efficient Federated Learning
Haoyu Zhao, Zhize Li, Peter Richtárik
FedML
10 Aug 2021

A Non-parametric View of FedAvg and FedProx: Beyond Stationary Points
Lili Su, Jiaming Xu, Pengkun Yang
FedML
29 Jun 2021

Secure Distributed Training at Scale
Eduard A. Gorbunov, Alexander Borzunov, Michael Diskin, Max Ryabinin
FedML
21 Jun 2021

Decentralized Local Stochastic Extra-Gradient for Variational Inequalities
Aleksandr Beznosikov, Pavel Dvurechensky, Anastasia Koloskova, V. Samokhin, Sebastian U. Stich, Alexander Gasnikov
15 Jun 2021

FedNL: Making Newton-Type Methods Applicable to Federated Learning
M. Safaryan, Rustem Islamov, Xun Qian, Peter Richtárik
FedML
05 Jun 2021

FedDR -- Randomized Douglas-Rachford Splitting Algorithms for Nonconvex Federated Composite Optimization
Quoc Tran-Dinh, Nhan H. Pham, Dzung Phan, Lam M. Nguyen
FedML
05 Mar 2021

Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices
Max Ryabinin, Eduard A. Gorbunov, Vsevolod Plokhotnyuk, Gennady Pekhimenko
04 Mar 2021

Personalized Federated Learning: A Unified Framework and Universal Optimization Techniques
Filip Hanzely, Boxin Zhao, Mladen Kolar
FedML
19 Feb 2021

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani
FedML
14 Feb 2021

Efficient Algorithms for Federated Saddle Point Optimization
Charlie Hou, K. K. Thekumparampil, Giulia Fanti, Sewoong Oh
FedML
12 Feb 2021

Linearly Converging Error Compensated SGD
Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik
23 Oct 2020

Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications
Jihong Park, S. Samarakoon, Anis Elgabli, Joongheon Kim, M. Bennis, Seong-Lyun Kim, Mérouane Debbah
06 Aug 2020

FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML
28 Sep 2019