
Distributed Fixed Point Methods with Compressed Iterates (arXiv:1912.09925)

20 December 2019
Sélim Chraibi
Ahmed Khaled
D. Kovalev
Peter Richtárik
Adil Salim
Martin Takáč
    FedML

Papers citing "Distributed Fixed Point Methods with Compressed Iterates"

4 papers shown:

- DoCoFL: Downlink Compression for Cross-Device Federated Learning. Ron Dorfman, S. Vargaftik, Y. Ben-Itzhak, Kfir Y. Levy. FedML. 01 Feb 2023
- Federated Random Reshuffling with Compression and Variance Reduction. Grigory Malinovsky, Peter Richtárik. FedML. 08 May 2022
- Partial Variable Training for Efficient On-Device Federated Learning. Tien-Ju Yang, Dhruv Guliani, F. Beaufays, Giovanni Motta. FedML. 11 Oct 2021
- FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization. Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani. FedML. 28 Sep 2019