
Second-order Information Promotes Mini-Batch Robustness in Variance-Reduced Gradients

23 April 2024
Sachin Garg, A. Berahas, Michał Dereziński
arXiv: 2404.14758 (PDF, HTML)

Papers citing "Second-order Information Promotes Mini-Batch Robustness in Variance-Reduced Gradients"

3 / 3 papers shown
1. SAPPHIRE: Preconditioned Stochastic Variance Reduction for Faster Large-Scale Statistical Learning
   Jingruo Sun, Zachary Frangella, Madeleine Udell
   28 Jan 2025 (31 / 0 / 0)

2. Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization
   Zeyuan Allen-Zhu
   ODL · 12 Feb 2018 (42 / 52 / 0)

3. A Proximal Stochastic Gradient Method with Progressive Variance Reduction
   Lin Xiao, Tong Zhang
   ODL · 19 Mar 2014 (76 / 736 / 0)