Without-Replacement Sampling for Stochastic Gradient Methods: Convergence Results and Application to Distributed Optimization
Ohad Shamir
2 March 2016 · arXiv:1603.00570
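The paper analyzes stochastic gradient methods that process the data without replacement: each epoch visits every index exactly once in a fresh random order (random reshuffling), instead of drawing i.i.d. indices with replacement. A minimal sketch of that sampling scheme, using illustrative names (sgd_random_reshuffling, grad) that are assumptions, not the paper's own code:

    import numpy as np

    def sgd_random_reshuffling(grad, x0, n, lr, epochs, seed=0):
        # Without-replacement SGD: each epoch is a random permutation
        # of {0, ..., n-1}, so every component function is used exactly
        # once per pass. grad(x, i) is assumed to return the gradient
        # of the i-th component function f_i at x.
        rng = np.random.default_rng(seed)
        x = x0.copy()
        for _ in range(epochs):
            for i in rng.permutation(n):  # one without-replacement pass
                x -= lr * grad(x, i)
        return x

    # Example: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)**2
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    grad = lambda x, i: (A[i] @ x - b[i]) * A[i]
    x_hat = sgd_random_reshuffling(grad, np.zeros(5), n=100, lr=0.01, epochs=50)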
Papers citing "Without-Replacement Sampling for Stochastic Gradient Methods: Convergence Results and Application to Distributed Optimization" (8 papers)
1. Introduction to Online Convex Optimization [OffRL]
   Elad Hazan (07 Sep 2019)

2. Communication-Efficient Distributed Dual Coordinate Ascent
   Martin Jaggi, Virginia Smith, Martin Takáč, Jonathan Terhorst, S. Krishnan, Thomas Hofmann, Michael I. Jordan (04 Sep 2014)

3. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives [ODL]
   Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)

4. Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
   Ohad Shamir, Tong Zhang (08 Dec 2012)

5. Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
   Shai Shalev-Shwartz, Tong Zhang (10 Sep 2012)

6. Beneath the valley of the noncommutative arithmetic-geometric mean inequality: conjectures, case-studies, and consequences
   Benjamin Recht, Christopher Ré (19 Feb 2012)

7. A Reliable Effective Terascale Linear Learning System
   Alekh Agarwal, O. Chapelle, Miroslav Dudík, John Langford (19 Oct 2011)

8. Better Mini-Batch Algorithms via Accelerated Gradient Methods [ODL]
   Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan (22 Jun 2011)