Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes
Greg Yang
28 October 2019 · arXiv:1910.12478
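The title states the neural network-Gaussian process (NNGP) correspondence: as width grows, a randomly initialized network converges to a Gaussian process. The snippet below is a minimal empirical sketch of that limit for a one-hidden-layer ReLU network, not code from the paper; the width, input dimension, and weight scaling are illustrative assumptions.

```python
# Illustrative sketch (assumed setup, not the paper's tensor-program construction):
# sample many independent wide random ReLU networks at a fixed input and compare
# the empirical output variance with the NNGP prediction.
import numpy as np

rng = np.random.default_rng(0)
d, width, n_samples = 16, 4096, 2000   # assumed sizes for the demo
x = rng.standard_normal(d)             # fixed input point

outputs = []
for _ in range(n_samples):
    W = rng.standard_normal((width, d)) / np.sqrt(d)   # hidden weights, entries ~ N(0, 1/d)
    v = rng.standard_normal(width)                     # readout weights, entries ~ N(0, 1)
    outputs.append(v @ np.maximum(W @ x, 0.0) / np.sqrt(width))
outputs = np.array(outputs)

# For this parametrization the infinite-width limit is f(x) ~ N(0, ||x||^2 / (2d)).
print("empirical variance:", outputs.var())
print("NNGP variance:     ", (x @ x) / (2 * d))
```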
Papers citing "Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes"
12 papers shown
| Title | Authors | Tags | Date |
|---|---|---|---|
| Reduced Order Models and Conditional Expectation -- Analysing Parametric Low-Order Approximations | Hermann G. Matthies | | 17 Feb 2025 |
| Student-t processes as infinite-width limits of posterior Bayesian neural networks | Francesco Caporali, Stefano Favaro, Dario Trevisan | BDL | 06 Feb 2025 |
| Deep Kernel Posterior Learning under Infinite Variance Prior Weights | Jorge Loría, A. Bhadra | BDL, UQCV | 02 Oct 2024 |
| Function-Space MCMC for Bayesian Wide Neural Networks | Lucia Pezzetti, Stefano Favaro, Stefano Peluchetti | BDL | 26 Aug 2024 |
| u-μP: The Unit-Scaled Maximal Update Parametrization | Charlie Blake, C. Eichenberg, Josef Dean, Lukas Balles, Luke Y. Prince, Bjorn Deiseroth, Andres Felipe Cruz Salinas, Carlo Luschi, Samuel Weinbach, Douglas Orr | | 24 Jul 2024 |
| Bayesian RG Flow in Neural Network Field Theories | Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton | | 27 May 2024 |
| Random ReLU Neural Networks as Non-Gaussian Processes | Rahul Parhi, Pakshal Bohra, Ayoub El Biari, Mehrsa Pourya, Michael Unser | | 16 May 2024 |
| Improving Forward Compatibility in Class Incremental Learning by Increasing Representation Rank and Feature Richness | Jaeill Kim, Wonseok Lee, Moonjung Eo, Wonjong Rhee | CLL | 22 Mar 2024 |
| Connecting NTK and NNGP: A Unified Theoretical Framework for Wide Neural Network Learning Dynamics | Yehonatan Avidan, Qianyi Li, H. Sompolinsky | | 08 Sep 2023 |
| Over-parameterised Shallow Neural Networks with Asymmetrical Node Scaling: Global Convergence Guarantees and Feature Learning | François Caron, Fadhel Ayed, Paul Jung, Hoileong Lee, Juho Lee, Hongseok Yang | | 02 Feb 2023 |
| Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks | Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington | | 14 Jun 2018 |
| Adversarial Examples, Uncertainty, and Transfer Testing Robustness in Gaussian Process Hybrid Deep Networks | John Bradshaw, A. G. Matthews, Zoubin Ghahramani | BDL, AAML | 08 Jul 2017 |