Error bounds for approximations with deep ReLU networks
Dmitry Yarotsky
arXiv:1610.01145, 3 October 2016

Papers citing "Error bounds for approximations with deep ReLU networks"

50 / 633 papers shown
1. A Deep Learning approach to Reduced Order Modelling of Parameter Dependent Partial Differential Equations. N. R. Franco, Andrea Manzoni, P. Zunino. Mathematics of Computation (Math. Comp.), 2021. 10 Mar 2021.
2. Deep neural network approximation for high-dimensional parabolic Hamilton-Jacobi-Bellman equations. Philipp Grohs, L. Herrmann. 09 Mar 2021.
3. Data-driven Prediction of General Hamiltonian Dynamics via Learning Exactly-Symplectic Maps. Ren-Chuen Chen, Molei Tao. International Conference on Machine Learning (ICML), 2021. 09 Mar 2021.
4. Parametric Complexity Bounds for Approximating PDEs with Neural Networks. Tanya Marwah, Zachary Chase Lipton, Andrej Risteski. Neural Information Processing Systems (NeurIPS), 2021. 03 Mar 2021.
5. Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class. Yuling Jiao, Yanming Lai, Xiliang Lu, Fengru Wang, J. Yang, Yuanyuan Yang. SIAM Journal on Mathematical Analysis (SIAM J. Math. Anal.), 2021. 28 Feb 2021.
6. Optimal Approximation Rate of ReLU Networks in terms of Width and Depth. Zuowei Shen, Haizhao Yang, Shijun Zhang. Journal de Mathématiques Pures et Appliquées (JMPA), 2021. 28 Feb 2021.
7. Consistent Sparse Deep Learning: Theory and Computation. Y. Sun, Qifan Song, F. Liang. Journal of the American Statistical Association (JASA), 2021. 25 Feb 2021.
8. Quantitative approximation results for complex-valued neural networks. A. Caragea, D. Lee, J. Maly, G. Pfander, F. Voigtlaender. SIAM Journal on Mathematics of Data Science (SIMODS), 2021. 25 Feb 2021.
9. Universal Approximation Theorem for Neural Networks. Takato Nishijima. 19 Feb 2021.
10. ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation. Christoph Hertrich, Leon Sering. Conference on Integer Programming and Combinatorial Optimization (IPCO), 2021. 12 Feb 2021.
11. Novel Deep neural networks for solving Bayesian statistical inverse. Harbir Antil, H. Elman, Akwum Onwunta, Deepanshu Verma. 08 Feb 2021.
12. Depth separation beyond radial functions. Luca Venturi, Samy Jelassi, Tristan Ozuch, Joan Bruna. Journal of Machine Learning Research (JMLR), 2021. 02 Feb 2021.
13. Learning elliptic partial differential equations with randomized linear algebra. Nicolas Boullé, Alex Townsend. Foundations of Computational Mathematics (FoCM), 2021. 31 Jan 2021.
14. Size and Depth Separation in Approximating Benign Functions with Neural Networks. Gal Vardi, Daniel Reichman, T. Pitassi, Ohad Shamir. Annual Conference on Computational Learning Theory (COLT), 2021. 30 Jan 2021.
15. Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training. Cong Fang, Hangfeng He, Qi Long, Weijie J. Su. Proceedings of the National Academy of Sciences of the United States of America (PNAS), 2021. 29 Jan 2021.
16. On the capacity of deep generative networks for approximating distributions. Yunfei Yang, Zhen Li, Yang Wang. Neural Networks (NN), 2021. 29 Jan 2021.
17. Approximation Theory of Tree Tensor Networks: Tensorized Multivariate Functions. Mazen Ali, A. Nouy. 28 Jan 2021.
18. Partition of unity networks: deep hp-approximation. Kookjin Lee, N. Trask, Ravi G. Patel, Mamikon A. Gulian, E. Cyr. 27 Jan 2021.
19. Approximating Probability Distributions by ReLU Networks. Manuj Mukherjee, A. Tchamkerten, Mansoor I. Yousefi. Information Theory Workshop (ITW), 2021. 25 Jan 2021.
20. A simple geometric proof for the benefit of depth in ReLU networks. Asaf Amrami, Yoav Goldberg. 18 Jan 2021.
21. Reproducing Activation Function for Deep Learning. Senwei Liang, Liyao Lyu, Chunmei Wang, Haizhao Yang. Communications in Mathematical Sciences (CMS), 2021. 13 Jan 2021.
22. A Priori Generalization Analysis of the Deep Ritz Method for Solving High Dimensional Elliptic Equations. Jianfeng Lu, Yulong Lu, Min Wang. 05 Jan 2021.
23. Algorithmic Complexities in Backpropagation and Tropical Neural Networks. Ozgur Ceyhan. 03 Jan 2021.
24. Approximations with deep neural networks in Sobolev time-space. Ahmed Abdeljawad, Philipp Grohs. Analysis and Applications (Anal. Appl.), 2020. 23 Dec 2020.
25. Machine Learning Advances for Time Series Forecasting. Ricardo P. Masini, M. C. Medeiros, Eduardo F. Mendes. 23 Dec 2020.
26. Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers. K. Bollinger, Hayden Schaeffer. Mathematical and Scientific Machine Learning (MSML), 2020. 17 Dec 2020.
27. The Variational Method of Moments. Andrew Bennett, Nathan Kallus. 17 Dec 2020.
28. Strong overall error analysis for the training of artificial neural networks via random initializations. Arnulf Jentzen, Adrian Riekert. Communications in Mathematics and Statistics (Commun. Math. Stat.), 2020. 15 Dec 2020.
29. Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data. Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga. Mathematical and Scientific Machine Learning (MSML), 2020. 11 Dec 2020.
30. Data-driven Method for Estimating Aircraft Mass from Quick Access Recorder using Aircraft Dynamics and Multilayer Perceptron Neural Network. Xinyu He, Fang He, Xinting Zhu, Lishuai Li. 10 Dec 2020.
31. The Representation Power of Neural Networks: Breaking the Curse of Dimensionality. Moise Blanchard, M. A. Bennouna. 10 Dec 2020.
32. Estimation of the Mean Function of Functional Data via Deep Neural Networks. Shuoyang Wang, Guanqun Cao, Zuofeng Shang. 08 Dec 2020.
33. A General Computational Framework to Measure the Expressiveness of Complex Networks Using a Tighter Upper Bound of Linear Regions. Yutong Xie, Gaoxiang Chen, Shijie Zhao. 08 Dec 2020.
34. The universal approximation theorem for complex-valued neural networks. F. Voigtlaender. Applied and Computational Harmonic Analysis (ACHA), 2020. 06 Dec 2020.
35. Some observations on high-dimensional partial differential equations with Barron data. E. Weinan, Stephan Wojtowytsch. Mathematical and Scientific Machine Learning (MSML), 2020. 02 Dec 2020.
36. A Convenient Infinite Dimensional Framework for Generative Adversarial Learning. H. Asatryan, Hanno Gottschalk, Marieke Lippert, Matthias Rottmann. Electronic Journal of Statistics (EJS), 2020. 24 Nov 2020.
37. Numerically Solving Parametric Families of High-Dimensional Kolmogorov Partial Differential Equations via Deep Learning. Julius Berner, Markus Dablander, Philipp Grohs. 09 Nov 2020.
38. Advantage of Deep Neural Networks for Estimating Functions with Singularity on Hypersurfaces. Masaaki Imaizumi, Kenji Fukumizu. 04 Nov 2020.
39. Learning Barrier Functions with Memory for Robust Safe Navigation. Kehan Long, Cheng Qian, Jorge Cortés, Nikolay Atanasov. 03 Nov 2020.
40. Identification of complex mixtures for Raman spectroscopy using a novel scheme based on a new multi-label deep neural network. Liangrui Pan, Pronthep Pipitsunthonsan, C. Daengngam, M. Chongcheawchamnan. IEEE Sensors Journal (IEEE Sens. J.), 2020. 29 Oct 2020.
41. The ELBO of Variational Autoencoders Converges to a Sum of Three Entropies. Simon Damm, D. Forster, Dmytro Velychko, Zhenwen Dai, Asja Fischer, Jörg Lücke. International Conference on Artificial Intelligence and Statistics (AISTATS), 2020. 28 Oct 2020.
42. Deep Learning for Individual Heterogeneity. M. Farrell, Tengyuan Liang, S. Misra. 28 Oct 2020.
43. The Teaching Dimension of Kernel Perceptron. Akash Kumar, Hanqi Zhang, Adish Singla, Yuxin Chen. International Conference on Artificial Intelligence and Statistics (AISTATS), 2020. 27 Oct 2020.
44. Neural Network Approximation: Three Hidden Layers Are Enough. Zuowei Shen, Haizhao Yang, Shijun Zhang. Neural Networks (NN), 2020. 25 Oct 2020.
45. Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities. C. Marcati, J. Opschoor, P. Petersen, Christoph Schwab. Foundations of Computational Mathematics (FoCM), 2020. 23 Oct 2020.
46. On the Number of Linear Functions Composing Deep Neural Network: Towards a Refined Definition of Neural Networks Complexity. Yuuki Takai, Akiyoshi Sannai, Matthieu Cordonnier. 23 Oct 2020.
47. Binary Choice with Asymmetric Loss in a Data-Rich Environment: Theory and an Application to Racial Justice. Andrii Babii, Xi Chen, Eric Ghysels, Rohit Kumar. 16 Oct 2020.
48. Quantile regression with deep ReLU Networks: Estimators and minimax rates. Oscar Hernan Madrid Padilla, Wesley Tansey, Yanzhen Chen. Journal of Machine Learning Research (JMLR), 2020. 16 Oct 2020.
49. Deep Equals Shallow for ReLU Networks in Kernel Regimes. A. Bietti, Francis R. Bach. 30 Sep 2020.
50. Theoretical Analysis of the Advantage of Deepening Neural Networks. Yasushi Esaki, Yuta Nakahara, Toshiyasu Matsushima. International Conference on Machine Learning and Applications (ICMLA), 2020. 24 Sep 2020.
Page 9 of 13