ResearchTrend.AI

The Representation Power of Neural Networks: Breaking the Curse of Dimensionality
Moise Blanchard, M. A. Bennouna
arXiv:2012.05451 (v3, latest) · 10 December 2020

Papers citing "The Representation Power of Neural Networks: Breaking the Curse of Dimensionality"

4 citing papers shown
Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
04 Apr 2024
Learning Sparsity-Promoting Regularizers using Bilevel Optimization
SIAM Journal on Imaging Sciences (SIAM J. Imaging Sci.), 2022
Avrajit Ghosh, Michael T. McCann, Madeline Mitchell, S. Ravishankar
18 Jul 2022
Lyapunov-Net: A Deep Neural Network Architecture for Lyapunov Function Approximation
IEEE Conference on Decision and Control (CDC), 2021
Nathan Gaby, Fumin Zhang, X. Ye
27 Sep 2021
On the approximation of functions by tanh neural networks
Neural Networks (NN), 2021
Tim De Ryck, S. Lanthaler, Siddhartha Mishra
18 Apr 2021