
An Adaptive Stochastic Gradient Method with Non-negative Gauss-Newton Stepsizes

arXiv:2407.04358
5 July 2024
Antonio Orvieto, Lin Xiao

Papers citing "An Adaptive Stochastic Gradient Method with Non-negative Gauss-Newton Stepsizes"

4 / 4 papers shown

  1. Anticorrelated Noise Injection for Improved Generalization
     Antonio Orvieto, Hans Kersting, F. Proske, Francis R. Bach, Aurélien Lucchi
     06 Feb 2022
  2. L4: Practical loss-based stepsize adaptation for deep learning
     Michal Rolínek, Georg Martius
     14 Feb 2018
  3. Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
     Hamed Karimi, J. Nutini, Mark W. Schmidt
     16 Aug 2016
  4. A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
     Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
     10 Dec 2012