ResearchTrend.AI

arXiv:2210.12760

On double-descent in uncertainty quantification in overparametrized models

23 October 2022
Authors: Lucas Clarté, Bruno Loureiro, Florent Krzakala, Lenka Zdeborová
Tags: UQCV

Papers citing "On double-descent in uncertainty quantification in overparametrized models"

5 / 5 papers shown

  1. A Theory of Non-Linear Feature Learning with One Gradient Step in Two-Layer Neural Networks
     Authors: Behrad Moniri, Donghwan Lee, Hamed Hassani, Edgar Dobriban
     Tags: MLT
     Metrics: 24 / 19 / 0
     Date: 11 Oct 2023

  2. Performance of Bayesian linear regression in a model with mismatch
     Authors: Jean Barbier, Wei-Kuo Chen, D. Panchenko, Manuel Sáenz
     Metrics: 32 / 22 / 0
     Date: 14 Jul 2021

  3. Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime
     Authors: Stéphane d'Ascoli, Maria Refinetti, Giulio Biroli, Florent Krzakala
     Metrics: 83 / 152 / 0
     Date: 02 Mar 2020

  4. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
     Authors: Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
     Tags: UQCV, BDL
     Metrics: 268 / 5,652 / 0
     Date: 05 Dec 2016

  5. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
     Authors: Y. Gal, Zoubin Ghahramani
     Tags: UQCV, BDL
     Metrics: 247 / 9,109 / 0
     Date: 06 Jun 2015