DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation

14 January 2021
Alexandre Ramé, Matthieu Cord
Topics: FedML

Papers citing "DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation"

15 of 15 citing papers shown
1. Diversifying Deep Ensembles: A Saliency Map Approach for Enhanced OOD Detection, Calibration, and Accuracy
   Stanislav Dereka, I. Karpukhin, Maksim Zhdanov, Sergey Kolesnikov
   19 May 2023

2. Deep Anti-Regularized Ensembles provide reliable out-of-distribution uncertainty quantification
   Antoine de Mathelin, Francois Deheeger, Mathilde Mougeot, Nicolas Vayatis
   Topics: OOD, UQCV
   08 Apr 2023

3. Pathologies of Predictive Diversity in Deep Ensembles
   Taiga Abe, E. Kelly Buchanan, Geoff Pleiss, John P. Cunningham
   Topics: UQCV
   01 Feb 2023

4. Towards Inference Efficient Deep Ensemble Learning
   Ziyue Li, Kan Ren, Yifan Yang, Xinyang Jiang, Yuqing Yang, Dongsheng Li
   Topics: BDL
   29 Jan 2023

5. A Unified Theory of Diversity in Ensemble Learning
   Danny Wood, Tingting Mu, Andrew M. Webb, Henry W. J. Reeve, M. Luján, Gavin Brown
   Topics: UQCV
   10 Jan 2023

6. Ensembles for Uncertainty Estimation: Benefits of Prior Functions and Bootstrapping
   Vikranth Dwaracherla, Zheng Wen, Ian Osband, Xiuyuan Lu, S. Asghari, Benjamin Van Roy
   Topics: UQCV
   08 Jun 2022

7. Diverse Lottery Tickets Boost Ensemble from a Single Pretrained Model
   Sosuke Kobayashi, Shun Kiyono, Jun Suzuki, Kentaro Inui
   Topics: MoMe
   24 May 2022

8. Towards efficient feature sharing in MIMO architectures
   Rémy Sun, Alexandre Ramé, Clément Masson, Nicolas Thome, Matthieu Cord
   20 May 2022

9. Reducing Information Bottleneck for Weakly Supervised Semantic Segmentation
   Jungbeom Lee, Jooyoung Choi, J. Mok, Sungroh Yoon
   Topics: SSeg
   13 Oct 2021

10. Repulsive Deep Ensembles are Bayesian
    Francesco D'Angelo, Vincent Fortuin
    Topics: UQCV, BDL
    22 Jun 2021

11. Neural Bootstrapper
    Minsuk Shin, Hyungjoon Cho, Hyun-Seok Min, Sungbin Lim
    Topics: UQCV, BDL
    02 Oct 2020

12. Anytime Inference with Distilled Hierarchical Neural Ensembles
    Adria Ruiz, Jakob Verbeek
    Topics: UQCV, BDL, FedML
    03 Mar 2020

13. Knowledge Distillation by On-the-Fly Native Ensemble
    Xu Lan, Xiatian Zhu, S. Gong
    12 Jun 2018

14. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
    Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
    Topics: UQCV, BDL
    05 Dec 2016

15. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    Y. Gal, Zoubin Ghahramani
    Topics: UQCV, BDL
    06 Jun 2015