ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Random Reshuffling with Variance Reduction: New Analysis and Better Rates

19 April 2021
Grigory Malinovsky, Alibek Sailanbayev, Peter Richtárik
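For context on the scheme the title refers to: Random Reshuffling (RR) runs SGD in epochs, drawing a fresh permutation of the data each epoch and taking one pass over it, instead of sampling indices with replacement. The sketch below is only an illustration of that sampling scheme on a toy problem; it is not the variance-reduced method analyzed in the paper, and all problem data are made up.

```python
import random

def rr_sgd(grads, x0, lr, epochs, seed=0):
    """Random Reshuffling: one full pass per epoch over a fresh permutation."""
    rng = random.Random(seed)
    x = x0
    n = len(grads)
    for _ in range(epochs):
        order = list(range(n))
        rng.shuffle(order)          # new permutation every epoch
        for i in order:             # visit each sample exactly once
            x = x - lr * grads[i](x)
    return x

# Toy 1-D least squares: f_i(x) = 0.5 * (x - a_i)^2, minimizer = mean(a).
a = [1.0, 2.0, 3.0, 4.0]
grads = [lambda x, ai=ai: x - ai for ai in a]
x_star = rr_sgd(grads, x0=0.0, lr=0.1, epochs=200)
print(x_star)  # close to mean(a) = 2.5 (constant step size leaves a small bias)
```

With a constant step size, plain RR hovers near the minimizer but does not converge exactly; removing that residual bias is what variance-reduction techniques address.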

Papers citing "Random Reshuffling with Variance Reduction: New Analysis and Better Rates"

4 / 4 papers shown
Variance Reduction Methods Do Not Need to Compute Full Gradients: Improved Efficiency through Shuffling
Daniil Medyakov, Gleb Molodtsov, S. Chezhegov, Alexey Rebrikov, Aleksandr Beznosikov
21 Feb 2025
High Probability Guarantees for Random Reshuffling
Hengxu Yu, Xiao Li
20 Nov 2023
Federated Random Reshuffling with Compression and Variance Reduction
Grigory Malinovsky, Peter Richtárik
08 May 2022
Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
Aryan Mokhtari, Mert Gurbuzbalaban, Alejandro Ribeiro
01 Nov 2016