Averaged Adam accelerates stochastic optimization in the training of deep neural network approximations for partial differential equation and optimal control problems

10 January 2025
Steffen Dereich, Arnulf Jentzen, Adrian Riekert
AI4CE
ArXiv (abs) · PDF · HTML
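
The title refers to combining the Adam optimizer with averaging of its iterates. As a rough illustration of that idea, the sketch below runs standard Adam while maintaining a plain arithmetic mean of all iterates; this is a minimal variant, assuming the simplest averaging rule (the paper studies more refined schemes), and the function name, default hyperparameters, and `grad_fn` interface are illustrative rather than the authors' code.

```python
import numpy as np

def adam_with_averaging(grad_fn, theta0, steps=1000, lr=1e-3,
                        beta1=0.9, beta2=0.999, eps=1e-8):
    """Run Adam and also return the running (Polyak-style) average of the iterates.

    grad_fn maps parameters -> a stochastic gradient estimate.
    Hyperparameter names and defaults follow the standard Adam setup;
    the averaging here is an incremental arithmetic mean over all iterates,
    one simple stand-in for the averaging studied in the paper.
    """
    theta = theta0.astype(float).copy()
    m = np.zeros_like(theta)   # first-moment estimate
    v = np.zeros_like(theta)   # second-moment estimate
    theta_avg = theta.copy()   # running average of iterates
    for t in range(1, steps + 1):
        g = grad_fn(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
        theta -= lr * m_hat / (np.sqrt(v_hat) + eps)
        theta_avg += (theta - theta_avg) / (t + 1)  # incremental mean
    return theta, theta_avg

# Example: noisy gradient of the quadratic ||theta||^2 (hypothetical test problem).
rng = np.random.default_rng(0)
noisy_grad = lambda th: 2 * th + 0.1 * rng.standard_normal(th.shape)
last_iterate, averaged_iterate = adam_with_averaging(noisy_grad, np.ones(3), steps=5000)
```

On problems like this, the averaged iterate typically sits closer to the minimizer than the last iterate, since averaging damps the oscillation induced by gradient noise.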

Papers citing "Averaged Adam accelerates stochastic optimization in the training of deep neural network approximations for partial differential equation and optimal control problems"

1 / 1 papers shown

PADAM: Parallel averaged Adam reduces the error for stochastic optimization in scientific machine learning
Arnulf Jentzen, Julian Kranz, Adrian Riekert
ODL
28 May 2025