ResearchTrend.AI
Finite size corrections for neural network Gaussian processes (arXiv:1908.10030)

J. Antognini
27 August 2019
Topics: BDL

Papers citing "Finite size corrections for neural network Gaussian processes"

25 of 25 papers shown

Fermions and Supersymmetry in Neural Network Field Theories
Chemical Science (Chem. Sci.), 2025
Samuel Frank, James Halverson, Anindita Maiti, Fabian Ruehle
20 Nov 2025

Stochastic Kernel Regularisation Improves Generalisation in Deep Kernel Machines
Neural Information Processing Systems (NeurIPS), 2024
Edward Milsom, Ben Anson, Laurence Aitchison
08 Oct 2024

Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection
Steven Adams, A. Patané, Morteza Lahijanian, Luca Laurenti
26 Jul 2024

Bayesian RG Flow in Neural Network Field Theories
Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton
27 May 2024

A note on regularised NTK dynamics with an application to PAC-Bayesian training
Eugenio Clerico, Benjamin Guedj
20 Dec 2023

Depthwise Hyperparameter Transfer in Residual Networks: Dynamics and Scaling Limit
International Conference on Learning Representations (ICLR), 2023
Blake Bordelon, Lorenzo Noci, Mufan Li, Boris Hanin, Cengiz Pehlevan
28 Sep 2023

A Primer on Bayesian Neural Networks: Review and Debates
Federico Danieli, Konstantinos Pitas, M. Vladimirova, Vincent Fortuin
Topics: BDL, AAML
28 Sep 2023

Convolutional Deep Kernel Machines
International Conference on Learning Representations (ICLR), 2023
Edward Milsom, Ben Anson, Laurence Aitchison
Topics: BDL
18 Sep 2023

Local Kernel Renormalization as a mechanism for feature learning in overparametrized Convolutional Neural Networks
Nature Communications (Nat. Commun.), 2023
R. Aiudi, R. Pacelli, A. Vezzani, R. Burioni, P. Rotondo
Topics: MLT
21 Jul 2023

Structures of Neural Network Effective Theories
Çağın Ararat, Tianji Cai, Cem Tekin, Zhengkang Zhang
03 May 2023

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities
Symposium on Advances in Approximate Bayesian Inference (AABI), 2023
Alberto Bordino, Stefano Favaro, S. Fortini
08 Apr 2023

Bayesian inference with finitely wide neural networks
Physical Review E (PRE), 2023
Chi-Ken Lu
Topics: BDL
06 Mar 2023

On Connecting Deep Trigonometric Networks with Deep Gaussian Processes: Covariance, Expressivity, and Neural Tangent Kernel
Chi-Ken Lu, Patrick Shafto
Topics: BDL
14 Mar 2022

Contrasting random and learned features in deep Bayesian linear regression
Physical Review E (Phys. Rev. E), 2022
Jacob A. Zavatone-Veth, William L. Tong, Cengiz Pehlevan
Topics: BDL, MLT
01 Mar 2022

The edge of chaos: quantum field theory and deep neural networks
SciPost Physics (SciPost Phys.), 2021
Kevin T. Grosvenor, R. Jefferson
27 Sep 2021

A theory of representation learning gives a deep generalisation of kernel methods
International Conference on Machine Learning (ICML), 2021
Adam X. Yang, Maxime Robeyns, Edward Milsom, Ben Anson, Nandi Schoots, Laurence Aitchison
Topics: BDL
30 Aug 2021

Deep Stable neural networks: large-width asymptotics and convergence rates
Stefano Favaro, S. Fortini, Stefano Peluchetti
Topics: BDL
02 Aug 2021

Asymptotics of representation learning in finite Bayesian neural networks
Neural Information Processing Systems (NeurIPS), 2021
Jacob A. Zavatone-Veth, Abdulkadir Canatar, Benjamin S. Ruben, Cengiz Pehlevan
01 Jun 2021

Non-asymptotic approximations of neural networks by Gaussian processes
Annual Conference on Computational Learning Theory (COLT), 2021
Ronen Eldan, Dan Mikulincer, T. Schramm
17 Feb 2021

Generalization bounds for deep learning
Guillermo Valle Pérez, A. Louis
Topics: BDL
07 Dec 2020

Statistical Mechanics of Deep Linear Neural Networks: The Back-Propagating Kernel Renormalization
Qianyi Li, H. Sompolinsky
07 Dec 2020

Neural Networks and Quantum Field Theory
James Halverson, Anindita Maiti, Keegan Stoner
19 Aug 2020

Finite Versus Infinite Neural Networks: an Empirical Study
Neural Information Processing Systems (NeurIPS), 2020
Jaehoon Lee, S. Schoenholz, Jeffrey Pennington, Ben Adlam, Lechao Xiao, Roman Novak, Jascha Narain Sohl-Dickstein
31 Jul 2020

Large Deviation Analysis of Function Sensitivity in Random Deep Neural Networks
Bo Li, D. Saad
13 Oct 2019

Non-Gaussian processes and neural networks at finite widths
Mathematical and Scientific Machine Learning (MSML), 2019
Sho Yaida
30 Sep 2019