ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Distributed Byzantine Tolerant Stochastic Gradient Descent in the Era of Big Data

27 February 2019
Richeng Jin, Xiaofan He, H. Dai
Community: FedML
arXiv: 1902.10336 (abs | PDF | HTML)

Papers citing "Distributed Byzantine Tolerant Stochastic Gradient Descent in the Era of Big Data"

4 papers:
  • Byzantine Fault Tolerance in Distributed Machine Learning: a Survey
    Djamila Bouhata, Hamouma Moumen, Moumen Hamouma, Ahcène Bounceur
    AI4CE · 05 May 2022
  • Byzantine-Robust Variance-Reduced Federated Learning over Distributed Non-i.i.d. Data
    Jie Peng, Zhaoxian Wu, Qing Ling, Tianyi Chen
    OOD, FedML · 17 Sep 2020
  • ByGARS: Byzantine SGD with Arbitrary Number of Attackers
    Jayanth Reddy Regatti, Hao Chen, Abhishek Gupta
    FedML, AAML · 24 Jun 2020
  • BRIDGE: Byzantine-resilient Decentralized Gradient Descent
    Cheng Fang, Zhixiong Yang, W. Bajwa
    21 Aug 2019