Anytime Stochastic Gradient Descent: A Time to Hear from all the Workers
Nuwan S. Ferdinand, S. Draper
arXiv:1810.02976, 6 October 2018
Papers citing "Anytime Stochastic Gradient Descent: A Time to Hear from all the Workers" (7 papers):
Lightweight Projective Derivative Codes for Compressed Asynchronous Gradient Descent
Pedro Soto, Ilia Ilmer, Haibin Guan, Jun Li (31 Jan 2022)

Gradient Coding with Dynamic Clustering for Straggler-Tolerant Distributed Learning
Baturalp Buyukates, Emre Ozfatura, S. Ulukus, Deniz Gunduz (01 Mar 2021)

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients [FedML]
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani (14 Feb 2021)

Diversity/Parallelism Trade-off in Distributed Systems with Redundancy
Pei Peng, E. Soljanin, P. Whiting (05 Oct 2020)

Coded Distributed Computing with Partial Recovery
Emre Ozfatura, S. Ulukus, Deniz Gunduz (04 Jul 2020)

Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability [GNN]
J. Keuper, Franz-Josef Pfreundt (22 Sep 2016)

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao (07 Dec 2010)