Communication-Efficient and Byzantine-Robust Distributed Learning with Error Feedback

IEEE Journal on Selected Areas in Information Theory (JSAIT), 2019
21 November 2019
Avishek Ghosh, R. Maity, S. Kadhe, A. Mazumdar, Kannan Ramchandran
FedML
arXiv: 1911.09721

Papers citing "Communication-Efficient and Byzantine-Robust Distributed Learning with Error Feedback"

13 of 13 papers shown
Incentivize Contribution and Learn Parameters Too: Federated Learning with Strategic Data Owners
Drashthi Doshi, Aditya Vema Reddy Kesari, Avishek Ghosh, Suhas S Kowshik
FedML, TDI
17 May 2025
Byzantine-Robust and Communication-Efficient Distributed Learning via Compressed Momentum Filtering
Changxin Liu, Yanghao Li, Yuhao Yi, Karl H. Johansson
FedML
13 Sep 2024
Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
15 Oct 2023
Optimal Compression of Unit Norm Vectors in the High Distortion Regime
International Symposium on Information Theory (ISIT), 2023
He Zhu, Avishek Ghosh, A. Mazumdar
16 Jul 2023
Byzantine-Resilient Federated Learning at Edge
IEEE Transactions on Computers (IEEE Trans. Comput.), 2023
Youming Tao, Sijia Cui, Wenlu Xu, Haofei Yin, Dongxiao Yu, W. Liang, Xiuzhen Cheng
FedML
18 Mar 2023
FedREP: A Byzantine-Robust, Communication-Efficient and Privacy-Preserving Framework for Federated Learning
Yi-Rui Yang, Kun Wang, Wulu Li
FedML
09 Mar 2023
Analysis of Error Feedback in Federated Non-Convex Optimization with Biased Compression
Xiaoyun Li, Ping Li
FedML
25 Nov 2022
Robust Distributed Learning Against Both Distributional Shifts and Byzantine Attacks
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2022
Guanqiang Zhou, Ping Xu, Yue Wang, Zhi Tian
OOD, FedML
29 Oct 2022
Variance Reduction is an Antidote to Byzantines: Better Rates, Weaker Assumptions and Communication Compression as a Cherry on the Top
Eduard A. Gorbunov, Samuel Horváth, Peter Richtárik, Gauthier Gidel
AAML
01 Jun 2022
Stochastic Alternating Direction Method of Multipliers for Byzantine-Robust Distributed Learning
Signal Processing (Signal Process.), 2021
Feng-Shih Lin, Weiyu Li, Qing Ling
FedML
13 Jun 2021
BROADCAST: Reducing Both Stochastic and Compression Noise to Robustify Communication-Efficient Federated Learning
IEEE Transactions on Signal and Information Processing over Networks (TSIPN), 2021
He Zhu, Qing Ling
FedML, AAML
14 Apr 2021
Distributed Newton Can Communicate Less and Resist Byzantine Workers
Avishek Ghosh, R. Maity, A. Mazumdar
FedML
15 Jun 2020
Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees
IEEE Transactions on Neural Networks and Learning Systems (IEEE TNNLS), 2020
Richeng Jin, Yufan Huang, Xiaofan He, H. Dai, Tianfu Wu
FedML
25 Feb 2020