Stochastic Learning under Random Reshuffling with Constant Step-sizes

21 March 2018
Bicheng Ying
Kun Yuan
Stefan Vlaski
Ali H. Sayed

Papers citing "Stochastic Learning under Random Reshuffling with Constant Step-sizes" (10 papers shown)

Distributed Random Reshuffling Methods with Improved Convergence
Kun-Yen Huang, Linli Zhou, Shi Pu (21 Jun 2023)

Federated Random Reshuffling with Compression and Variance Reduction
Grigory Malinovsky, Peter Richtárik (08 May 2022) [FedML]

Distributed Random Reshuffling over Networks
Kun-Yen Huang, Xiao Li, Andre Milzarek, Shi Pu, Junwen Qiu (31 Dec 2021)

Random-reshuffled SARAH does not need a full gradient computations
Aleksandr Beznosikov, Martin Takáč (26 Nov 2021)

Random Shuffling Beats SGD Only After Many Epochs on Ill-Conditioned Problems
Itay Safran, Ohad Shamir (12 Jun 2021)

Incremental Without Replacement Sampling in Nonconvex Optimization
Edouard Pauwels (15 Jul 2020)

Random Reshuffling: Simple Analysis with Vast Improvements
Konstantin Mishchenko, Ahmed Khaled, Peter Richtárik (10 Jun 2020)

Variance-Reduced Decentralized Stochastic Optimization with Gradient Tracking -- Part II: GT-SVRG
Ran Xin, U. Khan, S. Kar (08 Oct 2019)

How Good is SGD with Random Shuffling?
Itay Safran, Ohad Shamir (31 Jul 2019)

On the Fundamental Limits of Coded Data Shuffling for Distributed Machine Learning
Adel M. Elmahdy, S. Mohajer (11 Jul 2018) [FedML]