On the Approximation Power of Two-Layer Networks of Random ReLUs
arXiv:2102.02336 · 3 February 2021
Daniel J. Hsu, Clayton Sanford, Rocco A. Servedio, Emmanouil-Vasileios Vlatakis-Gkaragkounis
Papers citing "On the Approximation Power of Two-Layer Networks of Random ReLUs" (10 papers)
On the Generalization Properties of Diffusion Models
Puheng Li, Zhong Li, Huishuai Zhang, Jiang Bian · 13 Mar 2025
ScoreFusion: Fusing Score-based Generative Models via Kullback-Leibler Barycenters
Hao Liu, Junze Tony Ye, Jose H. Blanchet · 28 Jun 2024
Randomly Initialized One-Layer Neural Networks Make Data Linearly Separable
Promit Ghosal, Srinath Mahankali, Yihang Sun · 24 May 2022
Optimization-Based Separations for Neural Networks
Itay Safran, Jason D. Lee · 04 Dec 2021
Quantum machine learning beyond kernel methods
Sofiene Jerbi, Lukas J. Fiderer, Hendrik Poulsen Nautrup, Jonas M. Kubler, H. Briegel, Vedran Dunjko · 25 Oct 2021
Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem
Clayton Sanford, Vaggos Chatziafratis · 19 Oct 2021
Neural Active Learning with Performance Guarantees
Pranjal Awasthi, Christoph Dann, Claudio Gentile, Ayush Sekhari, Zhilei Wang · 06 Jun 2021
Depth separation beyond radial functions
Luca Venturi, Samy Jelassi, Tristan Ozuch, Joan Bruna · 02 Feb 2021
Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with ℓ¹ and ℓ⁰ Controls
Jason M. Klusowski, Andrew R. Barron · 26 Jul 2016
Benefits of depth in neural networks
Matus Telgarsky · 14 Feb 2016