Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection

26 July 2024
Steven Adams
A. Patané
Morteza Lahijanian
Luca Laurenti
    BDL

Papers citing "Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection"

3 papers shown

Non-asymptotic approximations of neural networks by Gaussian processes
Ronen Eldan, Dan Mikulincer, T. Schramm
17 Feb 2021

Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam
Mohammad Emtiyaz Khan, Didrik Nielsen, Voot Tangkaratt, Wu Lin, Y. Gal, Akash Srivastava
ODL
13 Jun 2018

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
UQCV
BDL
06 Jun 2015