Communication-Efficient and Byzantine-Robust Distributed Learning with Error Feedback
arXiv:1911.09721 · 21 November 2019
Avishek Ghosh, R. Maity, S. Kadhe, A. Mazumdar, K. Ramchandran
FedML

Papers citing "Communication-Efficient and Byzantine-Robust Distributed Learning with Error Feedback"

4 / 4 papers shown
Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
15 Oct 2023

Robust Distributed Learning Against Both Distributional Shifts and Byzantine Attacks
Guanqiang Zhou, Ping Xu, Yue Wang, Zhi Tian
OOD · FedML
29 Oct 2022

Patterns, predictions, and actions: A story about machine learning
Moritz Hardt, Benjamin Recht
SSL · AI4TS · AI4CE
10 Feb 2021

Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees
Richeng Jin, Yufan Huang, Xiaofan He, H. Dai, Tianfu Wu
FedML
25 Feb 2020