Zeno: Distributed Stochastic Gradient Descent with Suspicion-based Fault-tolerance (arXiv:1805.10032)

25 May 2018
Cong Xie
Oluwasanmi Koyejo
Indranil Gupta
    FedML
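Zeno's central idea, as the title indicates, is suspicion-based fault tolerance: each worker's gradient is scored on a small validation batch, and only the least suspicious gradients are aggregated. A minimal sketch of that scoring-and-filtering step (the function name, signature, and penalty form here are illustrative, not the authors' code):

```python
import numpy as np

def zeno_aggregate(grads, loss_fn, x, lr, rho, b):
    """Average the (n - b) candidate gradients with the highest scores.

    Each candidate gradient g is scored by the estimated loss decrease
    it produces on a validation batch, minus a magnitude penalty:
        score(g) = loss_fn(x) - loss_fn(x - lr * g) - rho * ||g||^2
    Low-scoring (suspicious) gradients are excluded from the average,
    so up to b Byzantine workers can be tolerated.
    """
    base = loss_fn(x)
    scores = np.array([base - loss_fn(x - lr * g) - rho * np.dot(g, g)
                       for g in grads])
    keep = np.argsort(scores)[::-1][: len(grads) - b]  # top (n - b) scores
    return np.mean([grads[i] for i in keep], axis=0)
```

On a toy quadratic loss, an adversarial gradient pointing away from the minimum receives a strongly negative score and is filtered out, while honest gradients survive the cut.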

Papers citing "Zeno: Distributed Stochastic Gradient Descent with Suspicion-based Fault-tolerance"

8 / 8 papers shown
A Robust Classification Framework for Byzantine-Resilient Stochastic Gradient Descent
Shashank Reddy Chirra
K. Nadimpalli
Shrisha Rao
16 Jan 2023
FedCut: A Spectral Analysis Framework for Reliable Detection of Byzantine Colluders
Hanlin Gu
Lixin Fan
Xingxing Tang
Qiang Yang
AAML
FedML
24 Nov 2022
Secure Distributed Optimization Under Gradient Attacks
Shuhua Yu
S. Kar
28 Oct 2022
Byzantine-Resilient Decentralized Stochastic Optimization with Robust Aggregation Rules
Zhaoxian Wu
Tianyi Chen
Qing Ling
09 Jun 2022
Byzantine Fault Tolerance in Distributed Machine Learning: a Survey
Djamila Bouhata
Hamouma Moumen
Moumen Hamouma
Ahcène Bounceur
AI4CE
05 May 2022
BEV-SGD: Best Effort Voting SGD for Analog Aggregation Based Federated Learning against Byzantine Attackers
Xin-Yue Fan
Yue Wang
Yan Huo
Zhi Tian
FedML
18 Oct 2021
A Survey on Fault-tolerance in Distributed Optimization and Machine Learning
Shuo Liu
AI4CE
OOD
16 Jun 2021
DETOX: A Redundancy-based Framework for Faster and More Robust Gradient Aggregation
Shashank Rajput
Hongyi Wang
Zachary B. Charles
Dimitris Papailiopoulos
FedML
29 Jul 2019