Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks
arXiv:2309.15244
26 September 2023
Yahong Yang, Qipin Chen, Wenrui Hao
Papers citing "Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks" (5 of 5 papers shown)
Learn Sharp Interface Solution by Homotopy Dynamics
Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao
01 Feb 2025

Quantifying Training Difficulty and Accelerating Convergence in Neural Network-Based PDE Solvers
Chuqi Chen, Qixuan Zhou, Yahong Yang, Yang Xiang, Tao Luo
08 Oct 2024

Automatic Differentiation is Essential in Training Neural Networks for Solving Differential Equations
Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao
23 May 2024

Neural Tangent Kernel Beyond the Infinite-Width Limit: Effects of Depth and Initialization
Mariia Seleznova, Gitta Kutyniok
01 Feb 2022

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016