The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory
arXiv: 1803.08841
23 March 2018
Dan Alistarh, Christopher De Sa, Nikola Konstantinov
Papers citing "The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory" (5 of 5 papers shown)
Consistent Lock-free Parallel Stochastic Gradient Descent for Fast and Stable Convergence
Karl Bäckström, Ivan Walulya, Marina Papatriantafilou, P. Tsigas
17 Feb 2021
MixML: A Unified Analysis of Weakly Consistent Parallel Learning
Yucheng Lu, J. Nash, Christopher De Sa
14 May 2020
Weighted Aggregating Stochastic Gradient Descent for Parallel Deep Learning
Pengzhan Guo, Zeyang Ye, Keli Xiao, Wei Zhu
07 Apr 2020
Distributed Hierarchical GPU Parameter Server for Massive Scale Deep Learning Ads Systems
Weijie Zhao, Deping Xie, Ronglai Jia, Yulei Qian, Rui Ding, Mingming Sun, P. Li
12 Mar 2020
Don't Use Large Mini-Batches, Use Local SGD
Tao R. Lin, Sebastian U. Stich, Kumar Kshitij Patel, Martin Jaggi
22 Aug 2018