ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

On the existence of minimizers in shallow residual ReLU neural network optimization landscapes

28 February 2023
Steffen Dereich
Arnulf Jentzen
Sebastian Kassing
arXiv: 2302.14690

Papers citing "On the existence of minimizers in shallow residual ReLU neural network optimization landscapes"

3 / 3 papers shown
Non-convergence to global minimizers for Adam and stochastic gradient descent optimization and constructions of local minimizers in the training of artificial neural networks
Arnulf Jentzen, Adrian Riekert
07 Feb 2024
Approximation results for Gradient Descent trained Shallow Neural Networks in $1d$
R. Gentile, G. Welper
17 Sep 2022
Convergence of stochastic gradient descent schemes for Lojasiewicz-landscapes
Steffen Dereich, Sebastian Kassing
16 Feb 2021