Cited By: arXiv:2106.04110
"A self consistent theory of Gaussian Processes captures feature learning effects in finite CNNs"
Gadi Naveh, Z. Ringel. 8 June 2021. Tags: SSL, MLT.
Papers citing "A self consistent theory of Gaussian Processes captures feature learning effects in finite CNNs" (11 of 11 papers shown):

| Title | Authors | Tags | Date |
|---|---|---|---|
| Deep Neural Nets as Hamiltonians | Mike Winer, Boris Hanin | | 31 Mar 2025 |
| Bayesian RG Flow in Neural Network Field Theories | Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton | | 27 May 2024 |
| Restoring balance: principled under/oversampling of data for optimal classification | Emanuele Loffredo, Mauro Pastore, Simona Cocco, R. Monasson | | 15 May 2024 |
| Quantitative CLTs in Deep Neural Networks | Stefano Favaro, Boris Hanin, Domenico Marinucci, I. Nourdin, G. Peccati | BDL | 12 Jul 2023 |
| Dynamics of Finite Width Kernel and Prediction Fluctuations in Mean Field Neural Networks | Blake Bordelon, C. Pehlevan | MLT | 06 Apr 2023 |
| Globally Gated Deep Linear Networks | Qianyi Li, H. Sompolinsky | AI4CE | 31 Oct 2022 |
| Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks | Blake Bordelon, C. Pehlevan | MLT | 19 May 2022 |
| Contrasting random and learned features in deep Bayesian linear regression | Jacob A. Zavatone-Veth, William L. Tong, C. Pehlevan | BDL, MLT | 01 Mar 2022 |
| Separation of Scales and a Thermodynamic Description of Feature Learning in Some CNNs | Inbar Seroussi, Gadi Naveh, Z. Ringel | | 31 Dec 2021 |
| Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks | Blake Bordelon, Abdulkadir Canatar, C. Pehlevan | | 07 Feb 2020 |
| Why bigger is not always better: on finite and infinite neural networks | Laurence Aitchison | | 17 Oct 2019 |