ResearchTrend.AI

A PDE-based Explanation of Extreme Numerical Sensitivities and Edge of Stability in Training Neural Networks

4 June 2022 · Yuxin Sun, Dong Lao, G. Sundaramoorthi, A. Yezzi (arXiv:2206.02001)

Papers citing "A PDE-based Explanation of Extreme Numerical Sensitivities and Edge of Stability in Training Neural Networks"

4 papers shown

  • Understanding Edge-of-Stability Training Dynamics with a Minimalist Example — Xingyu Zhu, Zixuan Wang, Xiang Wang, Mo Zhou, Rong Ge — 07 Oct 2022
  • Understanding Gradient Descent on Edge of Stability in Deep Learning — Sanjeev Arora, Zhiyuan Li, A. Panigrahi — 19 May 2022
  • Channel-Directed Gradients for Optimization of Convolutional Neural Networks — Dong Lao, Peihao Zhu, Peter Wonka, G. Sundaramoorthi — 25 Aug 2020
  • A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights — Weijie Su, Stephen P. Boyd, Emmanuel J. Candes — 04 Mar 2015