Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes (arXiv:1810.05148)
11 October 2018
Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Greg Yang, Jiri Hron, Daniel A. Abolafia, Jeffrey Pennington, Jascha Narain Sohl-Dickstein
UQCV, BDL
Papers citing "Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes"
50 of 228 citing papers shown.
A Probabilistic Representation of Deep Learning for Improving The Information Theoretic Interpretability. Xinjie Lan, Kenneth Barner. FAtt. 27 Oct 2020.
Wearing a MASK: Compressed Representations of Variable-Length Sequences Using Recurrent Neural Tangent Kernels. Sina Alemohammad, H. Babaei, Randall Balestriero, Matt Y. Cheung, Ahmed Imtiaz Humayun, ..., Naiming Liu, Lorenzo Luzi, Jasper Tan, Zichao Wang, Richard G. Baraniuk. 27 Oct 2020.
Stable ResNet. Soufiane Hayou, Eugenio Clerico, Bo He, George Deligiannidis, Arnaud Doucet, Judith Rousseau. ODL, SSeg. 24 Oct 2020.
Label-Aware Neural Tangent Kernel: Toward Better Generalization and Local Elasticity. Shuxiao Chen, Hangfeng He, Weijie J. Su. 22 Oct 2020.
The Ridgelet Prior: A Covariance Function Approach to Prior Specification for Bayesian Neural Networks. Takuo Matsubara, Chris J. Oates, F. Briol. BDL, UQCV. 16 Oct 2020.
Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit. Ben Adlam, Jaehoon Lee, Lechao Xiao, Jeffrey Pennington, Jasper Snoek. UQCV, BDL. 14 Oct 2020.
Temperature check: theory and practice for training models with softmax-cross-entropy losses. Atish Agarwala, Jeffrey Pennington, Yann N. Dauphin, S. Schoenholz. UQCV. 14 Oct 2020.
Deep kernel processes. Laurence Aitchison, Adam X. Yang, Sebastian W. Ober. BDL. 04 Oct 2020.
Computational Separation Between Convolutional and Fully-Connected Networks. Eran Malach, Shai Shalev-Shwartz. 03 Oct 2020.
Tensor Programs III: Neural Matrix Laws. Greg Yang. 22 Sep 2020.
Asymptotics of Wide Convolutional Neural Networks. Anders Andreassen, Ethan Dyer. 19 Aug 2020.
Benign Overfitting and Noisy Features. Zhu Li, Weijie Su, Dino Sejdinovic. 06 Aug 2020.
Finite Versus Infinite Neural Networks: an Empirical Study. Jaehoon Lee, S. Schoenholz, Jeffrey Pennington, Ben Adlam, Lechao Xiao, Roman Novak, Jascha Narain Sohl-Dickstein. 31 Jul 2020.
Towards Learning Convolutions from Scratch. Behnam Neyshabur. SSL. 27 Jul 2020.
Bayesian Deep Ensembles via the Neural Tangent Kernel. Bobby He, Balaji Lakshminarayanan, Yee Whye Teh. BDL, UQCV. 11 Jul 2020.
The Computational Limits of Deep Learning. Neil C. Thompson, Kristjan Greenewald, Keeheon Lee, Gabriel F. Manso. VLM. 10 Jul 2020.
On the Similarity between the Laplace and Neural Tangent Kernels. Amnon Geifman, A. Yadav, Yoni Kasten, Meirav Galun, David Jacobs, Ronen Basri. 03 Jul 2020.
Tensor Programs II: Neural Tangent Kernel for Any Architecture. Greg Yang. 25 Jun 2020.
When Do Neural Networks Outperform Kernel Methods? Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari. 24 Jun 2020.
Bayesian Neural Networks: An Introduction and Survey. Ethan Goan, Clinton Fookes. BDL, UQCV. 22 Jun 2020.
Exact posterior distributions of wide Bayesian neural networks. Jiri Hron, Yasaman Bahri, Roman Novak, Jeffrey Pennington, Jascha Narain Sohl-Dickstein. UQCV, BDL. 18 Jun 2020.
Infinite attention: NNGP and NTK for deep attention networks. Jiri Hron, Yasaman Bahri, Jascha Narain Sohl-Dickstein, Roman Novak. 18 Jun 2020.
The Recurrent Neural Tangent Kernel. Sina Alemohammad, Zichao Wang, Randall Balestriero, Richard Baraniuk. AAML. 18 Jun 2020.
On the training dynamics of deep networks with $L_2$ regularization. Aitor Lewkowycz, Guy Gur-Ari. 15 Jun 2020.
Dynamical mean-field theory for stochastic gradient descent in Gaussian mixture classification. Francesca Mignacco, Florent Krzakala, Pierfrancesco Urbani, Lenka Zdeborová. MLT. 10 Jun 2020.
Halting Time is Predictable for Large Models: A Universality Property and Average-case Analysis. Courtney Paquette, B. V. Merrienboer, Elliot Paquette, Fabian Pedregosa. 08 Jun 2020.
Adversarial Robustness Guarantees for Random Deep Neural Networks. Giacomo De Palma, B. Kiani, S. Lloyd. AAML, OOD. 13 Apr 2020.
Reinforcement Learning via Gaussian Processes with Neural Network Dual Kernels. I. Goumiri, Benjamin W. Priest, M. Schneider. GP, BDL. 10 Apr 2020.
Predicting the outputs of finite deep neural networks trained with noisy gradients. Gadi Naveh, Oded Ben-David, H. Sompolinsky, Z. Ringel. 02 Apr 2020.
Frequency Bias in Neural Networks for Input of Non-Uniform Density. Ronen Basri, Meirav Galun, Amnon Geifman, David Jacobs, Yoni Kasten, S. Kritchman. 10 Mar 2020.
Scalable Uncertainty for Computer Vision with Functional Variational Inference. Eduardo D C Carvalho, R. Clark, Andrea Nicastro, Paul H. J. Kelly. BDL, UQCV. 06 Mar 2020.
Neural Kernels Without Tangents. Vaishaal Shankar, Alex Fang, Wenshuo Guo, Sara Fridovich-Keil, Ludwig Schmidt, Jonathan Ragan-Kelley, Benjamin Recht. 04 Mar 2020.
The large learning rate phase of deep learning: the catapult mechanism. Aitor Lewkowycz, Yasaman Bahri, Ethan Dyer, Jascha Narain Sohl-Dickstein, Guy Gur-Ari. ODL. 04 Mar 2020.
Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes. Jilin Hu, Jianbing Shen, B. Yang, Ling Shao. BDL, GNN. 26 Feb 2020.
Avoiding Kernel Fixed Points: Computing with ELU and GELU Infinite Networks. Russell Tsuchida, Tim Pearce, Christopher van der Heide, Fred Roosta, M. Gallagher. 20 Feb 2020.
Average-case Acceleration Through Spectral Density Estimation. Fabian Pedregosa, Damien Scieur. 12 Feb 2020.
Revisiting Spatial Invariance with Low-Rank Local Connectivity. Gamaleldin F. Elsayed, Prajit Ramachandran, Jonathon Shlens, Simon Kornblith. 07 Feb 2020.
Quasi-Equivalence of Width and Depth of Neural Networks. Fenglei Fan, Rongjie Lai, Ge Wang. 06 Feb 2020.
On the infinite width limit of neural networks with a standard parameterization. Jascha Narain Sohl-Dickstein, Roman Novak, S. Schoenholz, Jaehoon Lee. 21 Jan 2020.
Disentangling Trainability and Generalization in Deep Neural Networks. Lechao Xiao, Jeffrey Pennington, S. Schoenholz. 30 Dec 2019.
Optimization for deep learning: theory and algorithms. Ruoyu Sun. ODL. 19 Dec 2019.
Neural Tangents: Fast and Easy Infinite Neural Networks in Python. Roman Novak, Lechao Xiao, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Narain Sohl-Dickstein, S. Schoenholz. 05 Dec 2019.
Richer priors for infinitely wide multi-layer perceptrons. Russell Tsuchida, Fred Roosta, M. Gallagher. 29 Nov 2019.
Inference with Deep Generative Priors in High Dimensions. Jillian R. Fisher, Mojtaba Sahraee-Ardakan, S. Rangan, Zaid Harchaoui, Yejin Choi. BDL. 08 Nov 2019.
Mean-field inference methods for neural networks. Marylou Gabrié. AI4CE. 03 Nov 2019.
Enhanced Convolutional Neural Tangent Kernels. Zhiyuan Li, Ruosong Wang, Dingli Yu, S. Du, Wei Hu, Ruslan Salakhutdinov, Sanjeev Arora. 03 Nov 2019.
Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes. Greg Yang. 28 Oct 2019.
Explicitly Bayesian Regularizations in Deep Learning. Xinjie Lan, Kenneth Barner. UQCV, BDL, AI4CE. 22 Oct 2019.
Why bigger is not always better: on finite and infinite neural networks. Laurence Aitchison. 17 Oct 2019.
Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks. Sanjeev Arora, S. Du, Zhiyuan Li, Ruslan Salakhutdinov, Ruosong Wang, Dingli Yu. AAML. 03 Oct 2019.