Bidirectional compression in heterogeneous settings for distributed or federated learning with partial participation: tight convergence guarantees

Constantin Philippenko, Aymeric Dieuleveut
25 June 2020 · arXiv:2006.14591 · FedML

Papers citing "Bidirectional compression in heterogeneous settings for distributed or federated learning with partial participation: tight convergence guarantees"

16 papers

BOLD: Boolean Logic Deep Learning
Van Minh Nguyen, Cristian Ocampo, Aymen Askri, Louis Leconte, Ba-Hien Tran
AI4CE
25 May 2024

Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
15 Oct 2023

Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
12 May 2023

ELF: Federated Langevin Algorithms with Primal, Dual and Bidirectional Compression
Avetik G. Karagulyan, Peter Richtárik
FedML
08 Mar 2023

Adaptive Compression for Communication-Efficient Distributed Training
Maksim Makarenko, Elnur Gasanov, Rustem Islamov, Abdurakhmon Sadiev, Peter Richtárik
31 Oct 2022

FLamby: Datasets and Benchmarks for Cross-Silo Federated Learning in Realistic Healthcare Settings
Jean Ogier du Terrail, Samy Ayed, Edwige Cyffers, Felix Grimberg, Chaoyang He, ..., Sai Praneeth Karimireddy, Marco Lorenzi, Giovanni Neglia, Marc Tommasi, M. Andreux
FedML
10 Oct 2022

Downlink Compression Improves TopK Sparsification
William Zou, H. Sterck, Jun Liu
30 Sep 2022

Towards Efficient Communications in Federated Learning: A Contemporary Survey
Zihao Zhao, Yuzhu Mao, Yang Liu, Linqi Song, Ouyang Ye, Xinlei Chen, Wenbo Ding
FedML
02 Aug 2022

Federated Expectation Maximization with heterogeneity mitigation and variance reduction
Aymeric Dieuleveut, G. Fort, Eric Moulines, Geneviève Robin
FedML
03 Nov 2021

Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning
Xun Qian, Rustem Islamov, M. Safaryan, Peter Richtárik
FedML
02 Nov 2021

Federated learning and next generation wireless communications: A survey on bidirectional relationship
Debaditya Shome, Omer Waqar, Wali Ullah Khan
14 Oct 2021

ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training
Hui-Po Wang, Sebastian U. Stich, Yang He, Mario Fritz
FedML · AI4CE
11 Oct 2021

Fast Federated Learning in the Presence of Arbitrary Device Unavailability
Xinran Gu, Kaixuan Huang, Jingzhao Zhang, Longbo Huang
FedML
08 Jun 2021

FedNL: Making Newton-Type Methods Applicable to Federated Learning
M. Safaryan, Rustem Islamov, Xun Qian, Peter Richtárik
FedML
05 Jun 2021

Learned Gradient Compression for Distributed Deep Learning
L. Abrahamyan, Yiming Chen, Giannis Bekoulis, Nikos Deligiannis
16 Mar 2021

FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML
28 Sep 2019