arXiv: 2306.03240
Improving Accelerated Federated Learning with Compression and Importance Sampling
5 June 2023
Michał Grudzień, Grigory Malinovsky, Peter Richtárik
FedML
Papers citing "Improving Accelerated Federated Learning with Compression and Importance Sampling" (8 papers shown):

- A-FedPD: Aligning Dual-Drift is All Federated Primal-Dual Learning Needs. Yan Sun, Li Shen, Dacheng Tao (FedML), 27 Sep 2024.
- Correlated Quantization for Faster Nonconvex Distributed Optimization. Andrei Panferov, Yury Demidovich, Ahmad Rammal, Peter Richtárik (MQ), 10 Jan 2024.
- DualFL: A Duality-based Federated Learning Algorithm with Communication Acceleration in the General Convex Regime. Jongho Park, Jinchao Xu (FedML), 17 May 2023.
- GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity. A. Maranjyan, M. Safaryan, Peter Richtárik, 28 Oct 2022.
- Permutation Compressors for Provably Faster Distributed Nonconvex Optimization. Rafal Szlendak, A. Tyurin, Peter Richtárik, 07 Oct 2021.
- A Field Guide to Federated Optimization. Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu (FedML), 14 Jul 2021.
- Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients. A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani (FedML), 14 Feb 2021.
- FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization. Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani (FedML), 28 Sep 2019.