ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

On the Approximation Power of Two-Layer Networks of Random ReLUs (arXiv:2102.02336)

3 February 2021
Daniel J. Hsu
Clayton Sanford
Rocco A. Servedio
Emmanouil-Vasileios Vlatakis-Gkaragkounis

Papers citing "On the Approximation Power of Two-Layer Networks of Random ReLUs"

10 papers shown:
On the Generalization Properties of Diffusion Models
Puheng Li, Zhong Li, Huishuai Zhang, Jiang Bian
13 Mar 2025

ScoreFusion: Fusing Score-based Generative Models via Kullback-Leibler Barycenters
Hao Liu, Junze Tony Ye, Jose H. Blanchet
28 Jun 2024

Randomly Initialized One-Layer Neural Networks Make Data Linearly Separable
Promit Ghosal, Srinath Mahankali, Yihang Sun
24 May 2022

Optimization-Based Separations for Neural Networks
Itay Safran, Jason D. Lee
04 Dec 2021

Quantum machine learning beyond kernel methods
Sofiene Jerbi, Lukas J. Fiderer, Hendrik Poulsen Nautrup, Jonas M. Kubler, H. Briegel, Vedran Dunjko
25 Oct 2021

Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem
Clayton Sanford, Vaggos Chatziafratis
19 Oct 2021

Neural Active Learning with Performance Guarantees
Pranjal Awasthi, Christoph Dann, Claudio Gentile, Ayush Sekhari, Zhilei Wang
06 Jun 2021

Depth separation beyond radial functions
Luca Venturi, Samy Jelassi, Tristan Ozuch, Joan Bruna
02 Feb 2021

Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with $\ell^1$ and $\ell^0$ Controls
Jason M. Klusowski, Andrew R. Barron
26 Jul 2016

Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016