Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks

Journal of Scientific Computing (J. Sci. Comput.), 2023
26 September 2023
Yahong Yang, Qipin Chen, Wenrui Hao
arXiv 2309.15244: abs | PDF | HTML

Papers citing "Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks"

4 papers shown

Statistical Learning Guarantees for Group-Invariant Barron Functions
Yahong Yang, Wei Zhu
27 Sep 2025
Learn Singularly Perturbed Solutions via Homotopy Dynamics
Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao
01 Feb 2025
Quantifying Training Difficulty and Accelerating Convergence in Neural Network-Based PDE Solvers
Chuqi Chen, Qixuan Zhou, Yahong Yang, Yang Xiang, Tao Luo
08 Oct 2024
Automatic Differentiation is Essential in Training Neural Networks for Solving Differential Equations
Journal of Scientific Computing (J. Sci. Comput.), 2024
Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao
23 May 2024