Single Model Uncertainty Estimation via Stochastic Data Centering

14 July 2022
Jayaraman J. Thiagarajan
Rushil Anirudh
V. Narayanaswamy
P. Bremer
    UQCV
    OOD
Abstract

We are interested in estimating the uncertainties of deep neural networks, which play an important role in many scientific and engineering problems. In this paper, we present a striking new finding: an ensemble of neural networks with the same weight initialization, trained on datasets that are shifted by a constant bias, gives rise to slightly inconsistent trained models, where the differences in predictions are a strong indicator of epistemic uncertainties. Using the neural tangent kernel (NTK), we demonstrate that this phenomenon occurs in part because the NTK is not shift-invariant. Since this is achieved via a trivial input transformation, we show that this behavior can be approximated by training a single neural network -- using a technique that we call Δ-UQ -- that estimates uncertainty around a prediction by marginalizing out the effect of the biases during inference. We show that Δ-UQ's uncertainty estimates are superior to those of many current methods on a variety of benchmarks -- outlier rejection, calibration under distribution shift, and sequential design optimization of black-box functions. Code for Δ-UQ can be accessed at https://github.com/LLNL/DeltaUQ.
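The anchoring idea described in the abstract can be sketched in a few lines of PyTorch. The sketch below is an illustrative assumption based only on the abstract, not the released LLNL/DeltaUQ code; the class name DeltaUQRegressor, the network architecture, and the anchor-sampling loop are hypothetical. The idea it demonstrates: a single model consumes a constant-bias anchor together with the input's offset from that anchor, and at test time predictions are averaged over several anchors, with their spread serving as the epistemic-uncertainty estimate.

    # Hypothetical sketch of the anchoring idea from the abstract
    # (not the official LLNL/DeltaUQ implementation).
    import torch
    import torch.nn as nn

    class DeltaUQRegressor(nn.Module):
        def __init__(self, in_dim, hidden=64):
            super().__init__()
            # The network sees the anchor c and the residual (x - c)
            # concatenated, so its input dimension is doubled.
            self.net = nn.Sequential(
                nn.Linear(2 * in_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x, anchors):
            # x: (B, D) inputs; anchors: (B, D) constant biases drawn from the data.
            return self.net(torch.cat([anchors, x - anchors], dim=-1))

        @torch.no_grad()
        def predict_with_uncertainty(self, x, anchor_pool, n_anchors=10):
            # Marginalize over randomly chosen anchors: the mean is the prediction,
            # the standard deviation across anchors is the uncertainty proxy.
            preds = []
            for _ in range(n_anchors):
                idx = torch.randint(0, anchor_pool.shape[0], (x.shape[0],))
                preds.append(self.forward(x, anchor_pool[idx]))
            preds = torch.stack(preds)  # (n_anchors, B, 1)
            return preds.mean(dim=0), preds.std(dim=0)

During training, one would draw a fresh anchor for each example per batch (e.g. by shuffling the batch itself), so that a single set of weights learns to be consistent across anchor choices; the residual disagreement at inference is what the paper interprets as epistemic uncertainty.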
