ResearchTrend.AI
A Correspondence Between Random Neural Networks and Statistical Field Theory
arXiv:1710.06570 · 18 October 2017
S. Schoenholz, Jeffrey Pennington, Jascha Narain Sohl-Dickstein

Papers citing "A Correspondence Between Random Neural Networks and Statistical Field Theory" (12 papers shown)
Bayesian RG Flow in Neural Network Field Theories
Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton
27 May 2024

Dynamical Isometry based Rigorous Fair Neural Architecture Search
Jianxiang Luo, Junyi Hu, Tianji Pang, Weihao Huang, Chuan-Hsi Liu
05 Jul 2023

Renormalization in the neural network-quantum field theory correspondence
Harold Erbin, Vincent Lahoche, D. O. Samary
22 Dec 2022

Nonperturbative renormalization for the neural network-QFT correspondence
Harold Erbin, Vincent Lahoche, D. O. Samary
03 Aug 2021

Asymptotics of Wide Networks from Feynman Diagrams
Ethan Dyer, Guy Gur-Ari
25 Sep 2019

Novel Uncertainty Framework for Deep Learning Ensembles
Tal Kachman, Michal Moshkovitz, Michal Rosen-Zvi
Tags: UQCV, OOD, BDL
09 Apr 2019

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018

Deep learning generalizes because the parameter-function map is biased towards simple functions
Guillermo Valle Pérez, Chico Q. Camargo, A. Louis
Tags: MLT, AI4CE
22 May 2018

Replica Symmetry Breaking in Bipartite Spin Glasses and Neural Networks
Gavin Hartnett, Edward Parker, Edward Geist
17 Mar 2018

How to Start Training: The Effect of Initialization and Architecture
Boris Hanin, David Rolnick
05 Mar 2018

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Zhiwen Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
Tags: AIMat
26 Sep 2016

The Loss Surfaces of Multilayer Networks
A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun
Tags: ODL
30 Nov 2014