ResearchTrend.AI

The effective noise of Stochastic Gradient Descent
Francesca Mignacco, Pierfrancesco Urbani
arXiv:2112.10852 (v3, latest), 20 December 2021
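The paper's title refers to the noise that minibatch sampling injects into gradient descent. As background for the citing papers below, here is a minimal, generic sketch (not the paper's dynamical mean-field analysis) showing that the minibatch gradient is an unbiased but noisy estimate of the full gradient, with a noise magnitude that shrinks as the batch size grows; the toy least-squares problem and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: loss L(w) = (1/2N) * ||X w - y||^2
N, d = 2000, 10
X = rng.standard_normal((N, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(N)
w = rng.standard_normal(d)

def full_gradient(w):
    # Exact gradient over all N samples.
    return X.T @ (X @ w - y) / N

def minibatch_gradient(w, batch_size):
    # Gradient estimated from a random minibatch, as in SGD.
    idx = rng.choice(N, size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

# The SGD "noise" is the gap between minibatch and full gradient:
# its mean over many draws is ~0 (unbiased estimator), and its
# squared norm shrinks roughly like 1/batch_size.
g_full = full_gradient(w)
for B in (10, 100, 1000):
    noise = [minibatch_gradient(w, B) - g_full for _ in range(300)]
    mean_sq = np.mean([np.sum(n**2) for n in noise])
    print(f"batch {B:4d}: mean squared noise norm = {mean_sq:.4f}")
```

Running this prints a mean squared noise norm that decreases monotonically with batch size, which is the elementary picture that the works below refine with field-theoretic and stochastic-process tools.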

Papers citing "The effective noise of Stochastic Gradient Descent" (6 papers shown)

  • Convergence, Sticking and Escape: Stochastic Dynamics Near Critical Points in SGD (24 May 2025)
    Dmitry Dudukalov, Artem Logachov, Vladimir Lotov, Timofei Prasolov, Evgeny Prokopenko, Anton Tarasenko
  • Deep Linear Network Training Dynamics from Random Initialization: Data, Width, Depth, and Hyperparameter Transfer (4 Feb 2025)
    Blake Bordelon, Cengiz Pehlevan
  • Stochastic Gradient Descent-like relaxation is equivalent to Metropolis dynamics in discrete optimization and inference problems (11 Sep 2023)
    Maria Chiara Angelini, A. Cavaliere, Raffaele Marino, F. Ricci-Tersenghi
  • Connecting NTK and NNGP: A Unified Theoretical Framework for Wide Neural Network Learning Dynamics (8 Sep 2023)
    Yehonatan Avidan, Qianyi Li, H. Sompolinsky
  • Rigorous dynamical mean field theory for stochastic gradient descent methods (12 Oct 2022)
    Cédric Gerbelot, Emanuele Troiani, Francesca Mignacco, Florent Krzakala, Lenka Zdeborova
  • Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks (19 May 2022)
    Blake Bordelon, Cengiz Pehlevan