Non-Gaussian processes and neural networks at finite widths

Sho Yaida
30 September 2019
arXiv:1910.00019

Papers citing "Non-Gaussian processes and neural networks at finite widths"

28 papers shown:
• Deep Neural Nets as Hamiltonians. Mike Winer, Boris Hanin. 31 Mar 2025.
• Equivariant Neural Tangent Kernels. Philipp Misof, Pan Kessel, Jan E. Gerken. 10 Jun 2024.
• Bayesian RG Flow in Neural Network Field Theories. Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton. 27 May 2024.
• Random ReLU Neural Networks as Non-Gaussian Processes. Rahul Parhi, Pakshal Bohra, Ayoub El Biari, Mehrsa Pourya, Michael Unser. 16 May 2024.
• A theory of data variability in Neural Network Bayesian inference. Javed Lindner, David Dahmen, Michael Krämer, M. Helias. 31 Jul 2023.
• Quantitative CLTs in Deep Neural Networks. Stefano Favaro, Boris Hanin, Domenico Marinucci, I. Nourdin, G. Peccati. 12 Jul 2023.
• Dynamics of Finite Width Kernel and Prediction Fluctuations in Mean Field Neural Networks. Blake Bordelon, Cengiz Pehlevan. 06 Apr 2023.
• Bayes-optimal Learning of Deep Random Networks of Extensive-width. Hugo Cui, Florent Krzakala, Lenka Zdeborová. 01 Feb 2023.
• Catapult Dynamics and Phase Transitions in Quadratic Nets. David Meltzer, Junyu Liu. 18 Jan 2023.
• A Solvable Model of Neural Scaling Laws. A. Maloney, Daniel A. Roberts, J. Sully. 30 Oct 2022.
• Fast Finite Width Neural Tangent Kernel. Roman Novak, Jascha Narain Sohl-Dickstein, S. Schoenholz. 17 Jun 2022.
• Wide Bayesian neural networks have a simple weight posterior: theory and accelerated sampling. Jiri Hron, Roman Novak, Jeffrey Pennington, Jascha Narain Sohl-Dickstein. 15 Jun 2022.
• Gaussian Pre-Activations in Neural Networks: Myth or Reality? Pierre Wolinski, Julyan Arbel. 24 May 2022.
• Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks. Blake Bordelon, Cengiz Pehlevan. 19 May 2022.
• Analytic theory for the dynamics of wide quantum neural networks. Junyu Liu, K. Najafi, Kunal Sharma, F. Tacchino, Liang Jiang, Antonio Mezzacapo. 30 Mar 2022.
• Contrasting random and learned features in deep Bayesian linear regression. Jacob A. Zavatone-Veth, William L. Tong, Cengiz Pehlevan. 01 Mar 2022.
• Separation of Scales and a Thermodynamic Description of Feature Learning in Some CNNs. Inbar Seroussi, Gadi Naveh, Zohar Ringel. 31 Dec 2021.
• Rate of Convergence of Polynomial Networks to Gaussian Processes. Adam Klukowski. 04 Nov 2021.
• Conditional Deep Gaussian Processes: empirical Bayes hyperdata learning. Chi-Ken Lu, Patrick Shafto. 01 Oct 2021.
• Nonperturbative renormalization for the neural network-QFT correspondence. Harold Erbin, Vincent Lahoche, D. O. Samary. 03 Aug 2021.
• Dataset Distillation with Infinitely Wide Convolutional Networks. Timothy Nguyen, Roman Novak, Lechao Xiao, Jaehoon Lee. 27 Jul 2021.
• Random Neural Networks in the Infinite Width Limit as Gaussian Processes. Boris Hanin. 04 Jul 2021.
• A self consistent theory of Gaussian Processes captures feature learning effects in finite CNNs. Gadi Naveh, Zohar Ringel. 08 Jun 2021.
• Explaining Neural Scaling Laws. Yasaman Bahri, Ethan Dyer, Jared Kaplan, Jaehoon Lee, Utkarsh Sharma. 12 Feb 2021.
• Memorizing without overfitting: Bias, variance, and interpolation in over-parameterized models. J. Rocks, Pankaj Mehta. 26 Oct 2020.
• Mixed Moments for the Product of Ginibre Matrices. Nick Halmagyi, Shailesh Lal. 20 Jul 2020.
• Predicting the outputs of finite deep neural networks trained with noisy gradients. Gadi Naveh, Oded Ben-David, H. Sompolinsky, Zohar Ringel. 02 Apr 2020.
• Quantifying the probable approximation error of probabilistic inference programs. Marco F. Cusumano-Towner, Vikash K. Mansinghka. 31 May 2016.