Approximation Properties of Deep ReLU CNNs
Juncai He, Lin Li, Jinchao Xu
1 September 2021

Papers citing "Approximation Properties of Deep ReLU CNNs"

26 / 26 papers shown

Higher Order Approximation Rates for ReLU CNNs in Korobov Spaces
Yuwen Li, Guozhi Zhang
20 Jan 2025 · 58 · 1 · 0

An Interpretive Constrained Linear Model for ResNet and MgNet
Juncai He, Jinchao Xu, Lian Zhang, Jianqing Zhu
14 Dec 2021 · 34 · 18 · 0

Characterization of the Variation Spaces Corresponding to Shallow Neural Networks
Jonathan W. Siegel, Jinchao Xu
28 Jun 2021 · 32 · 43 · 0

Universal Consistency of Deep Convolutional Neural Networks
Shao-Bo Lin, Kaidong Wang, Yao Wang, Ding-Xuan Zhou
23 Jun 2021 · 29 · 22 · 0

ReLU Deep Neural Networks from the Hierarchical Basis Perspective
Juncai He, Lin Li, Jinchao Xu
AI4CE
10 May 2021 · 38 · 30 · 0

Universal Approximation Theorem for Equivariant Maps by Group CNNs
Wataru Kumagai, Akiyoshi Sannai
27 Dec 2020 · 69 · 13 · 0

Statistical theory for image classification using deep convolutional neural networks with cross-entropy loss under the hierarchical max-pooling model
Michael Kohler, S. Langer
27 Nov 2020 · 29 · 17 · 0

PyTorch: An Imperative Style, High-Performance Deep Learning Library
Adam Paszke, Sam Gross, Francisco Massa, Adam Lerer, James Bradbury, ..., Sasank Chilamkurthy, Benoit Steiner, Lu Fang, Junjie Bai, Soumith Chintala
ODL
03 Dec 2019 · 21 · 42,038 · 0

Constrained Linear Data-feature Mapping for Image Classification
Juncai He, Yuyan Chen, Lian Zhang, Jinchao Xu
23 Nov 2019 · 19 · 2 · 0

EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
Mingxing Tan, Quoc V. Le
3DV, MedIm
28 May 2019 · 27 · 17,950 · 0

Approximation and Non-parametric Estimation of ResNet-type Convolutional Neural Networks
Kenta Oono, Taiji Suzuki
24 Mar 2019 · 42 · 58 · 0

Nonlinear Approximation via Compositions
Zuowei Shen, Haizhao Yang, Shijun Zhang
26 Feb 2019 · 33 · 92 · 0

Error bounds for approximations with deep ReLU neural networks in $W^{s,p}$ norms
Ingo Gühring, Gitta Kutyniok, P. Petersen
21 Feb 2019 · 35 · 200 · 0

Equivalence of approximation by convolutional neural networks and fully-connected networks
P. Petersen, Felix Voigtländer
04 Sep 2018 · 20 · 78 · 0

Universality of Deep Convolutional Neural Networks
Ding-Xuan Zhou
HAI, PINN
28 May 2018 · 44 · 513 · 0

The Expressive Power of Neural Networks: A View from the Width
Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, Liwei Wang
08 Sep 2017 · 46 · 886 · 0

Understanding Deep Neural Networks with Rectified Linear Units
R. Arora, A. Basu, Poorya Mianjy, Anirbit Mukherjee
PINN
04 Nov 2016 · 112 · 640 · 0

Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review
T. Poggio, H. Mhaskar, Lorenzo Rosasco, Brando Miranda, Q. Liao
02 Nov 2016 · 48 · 575 · 0

Error bounds for approximations with deep ReLU networks
Dmitry Yarotsky
03 Oct 2016 · 56 · 1,226 · 0

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
PINN, 3DV
25 Aug 2016 · 437 · 36,599 · 0

Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with $\ell^1$ and $\ell^0$ Controls
Jason M. Klusowski, Andrew R. Barron
26 Jul 2016 · 144 · 142 · 0

Identity Mappings in Deep Residual Networks
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
16 Mar 2016 · 185 · 10,149 · 0

Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016 · 220 · 605 · 0

Deep Residual Learning for Image Recognition
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
MedIm
10 Dec 2015 · 673 · 192,638 · 0

Breaking the Curse of Dimensionality with Convex Neural Networks
Francis R. Bach
30 Dec 2014 · 34 · 701 · 0

On the Number of Linear Regions of Deep Neural Networks
Guido Montúfar, Razvan Pascanu, Kyunghyun Cho, Yoshua Bengio
08 Feb 2014 · 43 · 1,249 · 0