Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data

Mathematical and Scientific Machine Learning (MSML), 2020
11 December 2020
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
arXiv:2012.06081

Papers citing "Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data"

21 papers shown
Beyond Universal Approximation Theorems: Algorithmic Uniform Approximation by Neural Networks Trained with Noisy Data
Anastasis Kratsios, Tin Sum Cheng, Daniel Roy
31 Aug 2025
Cauchy Random Features for Operator Learning in Sobolev Space
Chunyang Liao, Deanna Needell, Hayden Schaeffer
01 Mar 2025
Operator Learning Using Random Features: A Tool for Scientific Computing
SIAM Review (SIAM Rev.), 2024
Nicholas H. Nelsen, Andrew M. Stuart
12 Aug 2024
Physics-informed deep learning and compressive collocation for high-dimensional diffusion-reaction equations: practical existence theory and numerics
Simone Brugiapaglia, N. Dexter, Samir Karam, Weiqi Wang
03 Jun 2024
Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
04 Apr 2024
Response Theory via Generative Score Modeling
L. T. Giorgini, Katherine Deck, Tobias Bischoff, Andre N. Souza
01 Feb 2024
A practical existence theorem for reduced order models based on convolutional autoencoders
N. R. Franco, Simone Brugiapaglia
01 Feb 2024
Do stable neural networks exist for classification problems? -- A new view on stability in AI
Z. N. D. Liu, A. C. Hansen
15 Jan 2024
A unified framework for learning with nonlinear model classes from arbitrary linear samples
International Conference on Machine Learning (ICML), 2023
Ben Adcock, Juan M. Cardenas, N. Dexter
25 Nov 2023
Neural Snowflakes: Universal Latent Graph Inference via Trainable Latent Geometries
International Conference on Learning Representations (ICLR), 2023
Haitz Sáez de Ocáriz Borde, Anastasis Kratsios
23 Oct 2023
Active Learning for Single Neuron Models with Lipschitz Non-Linearities
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Aarshvi Gajjar, Chinmay Hegde, Christopher Musco
24 Oct 2022
CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning
Sampling Theory, Signal Processing, and Data Analysis (TSPDA), 2022
Ben Adcock, Juan M. Cardenas, N. Dexter
25 Aug 2022
Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions
Weiqi Wang, Simone Brugiapaglia
02 Jun 2022
Optimal Learning
P. Binev, A. Bonito, Ronald A. DeVore, G. Petrova
30 Mar 2022
On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples
Memoirs of the European Mathematical Society (MMS), 2022
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
25 Mar 2022
A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
Statistics and Computing (Stat. Comput.), 2022
Xiaoyu Ma, S. Sardy, N. Hengartner, Nikolai Bobenko, Yen Ting Lin
21 Jan 2022
Convergence Rates for Learning Linear Operators from Noisy Data
Maarten V. de Hoop, Nikola B. Kovachki, Nicholas H. Nelsen, Andrew M. Stuart
27 Aug 2021
Neural Network Training Using $\ell_1$-Regularization and Bi-fidelity Data
Journal of Computational Physics (JCP), 2021
Subhayan De, Alireza Doostan
27 May 2021
The Random Feature Model for Input-Output Maps between Banach Spaces
Nicholas H. Nelsen, Andrew M. Stuart
20 May 2020
Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks
Journal of Scientific Computing (J. Sci. Comput.), 2020
Moritz Geist, P. Petersen, Mones Raslan, R. Schneider, Gitta Kutyniok
25 Apr 2020
The troublesome kernel -- On hallucinations, no free lunches and the accuracy-stability trade-off in inverse problems
SIAM Review (SIAM Rev.), 2020
N. Gottschling, Vegard Antun, A. Hansen, Ben Adcock
05 Jan 2020