
On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent
Scott Pesme, Aymeric Dieuleveut, Nicolas Flammarion
arXiv:2007.00534, 1 July 2020

Papers citing "On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent"

4 / 4 papers shown
AutoSGD: Automatic Learning Rate Selection for Stochastic Gradient Descent
Nikola Surjanovic, Alexandre Bouchard-Côté, Trevor Campbell
27 May 2025
Adaptive Learning Rate and Momentum for Training Deep Neural Networks
Zhiyong Hao, Yixuan Jiang, Huihua Yu, H. Chiang
22 Jun 2021
SVRG Meets AdaGrad: Painless Variance Reduction
Benjamin Dubois-Taine, Sharan Vaswani, Reza Babanezhad, Mark Schmidt, Simon Lacoste-Julien
18 Feb 2021
Robust, Accurate Stochastic Optimization for Variational Inference
Akash Kumar Dhaka, Alejandro Catalina, Michael Riis Andersen, Maans Magnusson, Jonathan H. Huggins, Aki Vehtari
01 Sep 2020