Explicit Personalization and Local Training: Double Communication Acceleration in Federated Learning
Kai Yi, Laurent Condat, Peter Richtárik
arXiv:2305.13170 · 22 May 2023 · FedML
Papers citing "Explicit Personalization and Local Training: Double Communication Acceleration in Federated Learning" (4 of 4 papers shown)

GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
A. Maranjyan, M. Safaryan, Peter Richtárik
28 Oct 2022

A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML
14 Jul 2021

FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
Samuel Horváth, Stefanos Laskaridis, Mario Almeida, Ilias Leondiadis, Stylianos I. Venieris, Nicholas D. Lane
26 Feb 2021

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani
FedML
14 Feb 2021