ResearchTrend.AI
arXiv:2309.15244 (Cited By)
Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks

26 September 2023
Yahong Yang, Qipin Chen, Wenrui Hao

Papers citing "Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks"

5 papers shown:

1. Learn Sharp Interface Solution by Homotopy Dynamics
   Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao (01 Feb 2025)
2. Quantifying Training Difficulty and Accelerating Convergence in Neural Network-Based PDE Solvers
   Chuqi Chen, Qixuan Zhou, Yahong Yang, Yang Xiang, Tao Luo (08 Oct 2024)
3. Automatic Differentiation is Essential in Training Neural Networks for Solving Differential Equations
   Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao (23 May 2024)
4. Neural Tangent Kernel Beyond the Infinite-Width Limit: Effects of Depth and Initialization
   Mariia Seleznova, Gitta Kutyniok (01 Feb 2022)
5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (15 Sep 2016)