ResearchTrend.AI

arXiv: 2010.07355 — Cited By
Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit

14 October 2020
Ben Adlam, Jaehoon Lee, Lechao Xiao, Jeffrey Pennington, Jasper Snoek
Topics: UQCV, BDL

Papers citing "Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit"

8 papers
  • Uncertainty Quantification for Machine Learning in Healthcare: A Survey
    L. J. L. Lopez, Shaza Elsharief, Dhiyaa Al Jorf, Firas Darwish, Congbo Ma, Farah E. Shamout — 04 May 2025
  • Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel
    Ziyang Jiang, Tongshu Zheng, Yiling Liu, David Carlson — 15 May 2022
  • Trust Your Robots! Predictive Uncertainty Estimation of Neural Networks with Sparse Gaussian Processes
    Jongseo Lee, Jianxiang Feng, Matthias Humt, M. Müller, Rudolph Triebel (UQCV) — 20 Sep 2021
  • Dataset Distillation with Infinitely Wide Convolutional Networks
    Timothy Nguyen, Roman Novak, Lechao Xiao, Jaehoon Lee (DD) — 27 Jul 2021
  • The large learning rate phase of deep learning: the catapult mechanism
    Aitor Lewkowycz, Yasaman Bahri, Ethan Dyer, Jascha Narain Sohl-Dickstein, Guy Gur-Ari (ODL) — 04 Mar 2020
  • Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
    Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington — 14 Jun 2018
  • Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
    Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell (UQCV, BDL) — 05 Dec 2016
  • Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    Y. Gal, Zoubin Ghahramani (UQCV, BDL) — 06 Jun 2015