Random Neural Networks in the Infinite Width Limit as Gaussian Processes

Boris Hanin · 4 July 2021 · arXiv:2107.01562 · BDL
ArXiv · PDF · HTML
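The paper's central claim is that the output of a randomly initialized, fully connected network converges to a Gaussian process as the hidden-layer widths grow. A minimal numerical sketch of that statement follows; the one-hidden-layer ReLU network, the 1/fan-in weight variance, and the excess-kurtosis check are illustrative assumptions of this example, not the paper's own construction.

```python
# Sketch: outputs of random one-hidden-layer ReLU nets at a fixed input
# look increasingly Gaussian as the width grows (illustrative example).
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net_outputs(x, width, n_samples):
    """Outputs of n_samples independent random ReLU nets at scalar input x.

    Weights are i.i.d. Gaussian with variance 1/fan_in, the standard
    scaling under which the infinite-width limit is a Gaussian process.
    """
    w1 = rng.normal(0.0, 1.0, size=(n_samples, width))                   # input (fan_in=1) -> hidden
    w2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, width))  # hidden -> output
    hidden = np.maximum(w1 * x, 0.0)                                     # ReLU activations
    return np.sum(w2 * hidden, axis=1)                                   # one scalar output per net

for width in (2, 8, 64, 512):
    out = random_relu_net_outputs(x=1.0, width=width, n_samples=20_000)
    # Excess kurtosis vanishes for a Gaussian; here it shrinks roughly like 1/width.
    kurt = np.mean((out - out.mean()) ** 4) / out.var() ** 2 - 3.0
    print(f"width={width:5d}  var={out.var():.3f}  excess kurtosis={kurt:+.3f}")
```

For x = 1 this scaling gives a limiting variance of E[ReLU(z)²] = 1/2 for z ~ N(0, 1), so the printed variances should hover near 0.5 while the excess kurtosis drifts toward 0.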

Papers citing "Random Neural Networks in the Infinite Width Limit as Gaussian Processes"

26 / 26 papers shown

Information-theoretic reduction of deep neural networks to linear models in the overparametrized proportional regime
Francesco Camilli, D. Tieplova, Eleonora Bergamin, Jean Barbier · 06 May 2025

Fractal and Regular Geometry of Deep Neural Networks
Simmaco Di Lillo, Domenico Marinucci, Michele Salvi, S. Vigogna · MDE, AI4CE · 08 Apr 2025

Deep Neural Nets as Hamiltonians
Mike Winer, Boris Hanin · 31 Mar 2025

Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators
Weiye Gan, Yicheng Li, Q. Lin, Zuoqiang Shi · 14 Mar 2025

Effective Non-Random Extreme Learning Machine
Daniela De Canditiis, Fabiano Veglianti · 25 Nov 2024

Proportional infinite-width infinite-depth limit for deep linear neural networks
Federico Bassetti, Lucia Ladelli, P. Rotondo · 22 Nov 2024

On the Impacts of the Random Initialization in the Neural Tangent Kernel Theory
Guhan Chen, Yicheng Li, Qian Lin · AAML · 08 Oct 2024

Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection
Steven Adams, A. Patané, Morteza Lahijanian, Luca Laurenti · BDL · 26 Jul 2024

Wide stable neural networks: Sample regularity, functional convergence and Bayesian inverse problems
Tomás Soto · 04 Jul 2024

Large Deviations of Gaussian Neural Networks with ReLU activation
Quirin Vogel · 27 May 2024

Bayesian RG Flow in Neural Network Field Theories
Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton · 27 May 2024

Random ReLU Neural Networks as Non-Gaussian Processes
Rahul Parhi, Pakshal Bohra, Ayoub El Biari, Mehrsa Pourya, Michael Unser · 16 May 2024

Spectral complexity of deep neural networks
Simmaco Di Lillo, Domenico Marinucci, Michele Salvi, S. Vigogna · BDL · 15 May 2024

Neural reproducing kernel Banach spaces and representer theorems for deep networks
Francesca Bartolucci, E. De Vito, Lorenzo Rosasco, S. Vigogna · 13 Mar 2024

Wide Deep Neural Networks with Gaussian Weights are Very Close to Gaussian Processes
Dario Trevisan · UQCV, BDL · 18 Dec 2023

Understanding Activation Patterns in Artificial Neural Networks by Exploring Stochastic Processes
S. Lehmler, Muhammad Saif-ur-Rehman, Tobias Glasmachers, Ioannis Iossifidis · 01 Aug 2023

Quantitative CLTs in Deep Neural Networks
Stefano Favaro, Boris Hanin, Domenico Marinucci, I. Nourdin, G. Peccati · BDL · 12 Jul 2023

A Quantitative Functional Central Limit Theorem for Shallow Neural Networks
Valentina Cammarota, Domenico Marinucci, M. Salvi, S. Vigogna · 29 Jun 2023

Gaussian random field approximation via Stein's method with applications to wide random neural networks
Krishnakumar Balasubramanian, L. Goldstein, Nathan Ross, Adil Salim · 28 Jun 2023

Structures of Neural Network Effective Theories
Çağın Ararat, Tianji Cai, Cem Tekin, Zhengkang Zhang · 03 May 2023

Convergence of neural networks to Gaussian mixture distribution
Yasuhiko Asao, Ryotaro Sakamoto, S. Takagi · BDL · 26 Apr 2022

Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks
Andrea Basteri, Dario Trevisan · BDL · 14 Mar 2022

Critical Initialization of Wide and Deep Neural Networks through Partial Jacobians: General Theory and Applications
Darshil Doshi, Tianyu He, Andrey Gromov · 23 Nov 2021

Depth induces scale-averaging in overparameterized linear Bayesian neural networks
Jacob A. Zavatone-Veth, C. Pehlevan · BDL, UQCV, MDE · 23 Nov 2021

Rate of Convergence of Polynomial Networks to Gaussian Processes
Adam Klukowski · 04 Nov 2021

Non-asymptotic approximations of neural networks by Gaussian processes
Ronen Eldan, Dan Mikulincer, T. Schramm · 17 Feb 2021