
Multilevel Minimization for Deep Residual Networks
arXiv:2004.06196
ESAIM Proceedings and Surveys (ESAIM Proc. Surv.), 2020
13 April 2020
Lisa Gaedke-Merzhäuser, Alena Kopanicáková, Rolf Krause

Papers citing "Multilevel Minimization for Deep Residual Networks" (7 papers)
Two-level overlapping additive Schwarz preconditioner for training scientific machine learning applications
Youngkyu Lee, Alena Kopanicáková, George Karniadakis
16 Jun 2024
A Multi-Level Framework for Accelerating Training Transformer Models
Longwei Zou, Han Zhang, Yangdong Deng
07 Apr 2024
Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training
SIAM Journal on Optimization (SIOPT), 2023
Serge Gratton, Alena Kopanicáková, P. Toint
14 Feb 2023
Training of deep residual networks with stochastic MG/OPT
Cyrill Planta, Alena Kopanicáková, Rolf Krause
09 Aug 2021
Globally Convergent Multilevel Training of Deep Residual Networks
Alena Kopanicáková, Rolf Krause
15 Jul 2021
Spline parameterization of neural network controls for deep learning
Stefanie Günther, Will Pazner, Dongping Qi
27 Feb 2021
Layer-Parallel Training with GPU Concurrency of Deep Residual Neural Networks via Nonlinear Multigrid
IEEE Conference on High Performance Extreme Computing (HPEC), 2020
Andrew Kirby, S. Samsi, Michael Jones, Albert Reuther, J. Kepner, V. Gadepally
14 Jul 2020