
Uncertainty Quantification for Deep Regression using Contextualised Normalizing Flows

Adriel Sosa Marco
John Daniel Kirwan
Alexia Toumpa
Simos Gerasimou
Main: 9 pages
Appendix: 7 pages
Bibliography: 3 pages
8 figures, 4 tables
Abstract

Quantifying uncertainty in deep regression models is important both for understanding a model's confidence and for safe decision-making in high-risk domains. Existing approaches that yield prediction intervals overlook distributional information, neglecting the effect of multimodal or asymmetric predictive distributions on decision-making. Similarly, full or approximate Bayesian methods, while yielding the predictive posterior density, demand major modifications to the model architecture and retraining. We introduce MCNF, a novel post hoc uncertainty quantification method that produces both prediction intervals and the full conditional predictive distribution. MCNF operates on top of the underlying trained predictive model, so no retraining of the predictive model is needed. We provide experimental evidence that the MCNF-based uncertainty estimate is well calibrated, is competitive with state-of-the-art uncertainty quantification methods, and provides richer information for downstream decision-making tasks.
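The abstract describes MCNF as a contextualised (conditional) normalizing flow fitted post hoc on top of an already-trained, frozen regressor. As a rough illustration of that post hoc setup only, and not the paper's MCNF implementation, the sketch below fits a small conditional flow to a one-dimensional target, using the frozen base model's point prediction as the conditioning context; all class names, layer choices, and hyperparameters are assumptions made for the example.

```python
# Minimal sketch of post hoc conditional-flow uncertainty for 1-D regression.
# Illustrative only; layer design and hyperparameters are assumptions.
import math
import torch
import torch.nn as nn

class ConditionalFlow1D(nn.Module):
    """Conditional affine layers interleaved with asinh nonlinearities.

    Maps a scalar target y to a standard-normal latent z, conditioned on a
    context vector c (here: the frozen base model's point prediction).
    """
    def __init__(self, context_dim: int, n_layers: int = 4, hidden: int = 64):
        super().__init__()
        # Each layer predicts a shift t(c) and log-scale s(c) from the context.
        self.nets = nn.ModuleList([
            nn.Sequential(nn.Linear(context_dim, hidden), nn.ReLU(),
                          nn.Linear(hidden, 2))
            for _ in range(n_layers)
        ])

    def log_prob(self, y: torch.Tensor, c: torch.Tensor) -> torch.Tensor:
        # Data -> latent direction, accumulating the log |det Jacobian|.
        z, log_det = y, torch.zeros_like(y)
        for i, net in enumerate(self.nets):
            t, s = net(c).chunk(2, dim=-1)
            u = (z - t) * torch.exp(-s)
            log_det = log_det - s
            if i < len(self.nets) - 1:           # invertible nonlinearity
                log_det = log_det - 0.5 * torch.log1p(u ** 2)
                z = torch.asinh(u)
            else:
                z = u
        base = -0.5 * (z ** 2 + math.log(2 * math.pi))
        return (base + log_det).squeeze(-1)

    @torch.no_grad()
    def sample(self, c: torch.Tensor, n: int = 1000) -> torch.Tensor:
        # Latent -> data direction: invert each layer in reverse order.
        z = torch.randn(n, c.shape[0], 1)
        for i in reversed(range(len(self.nets))):
            t, s = self.nets[i](c).chunk(2, dim=-1)
            if i < len(self.nets) - 1:
                z = torch.sinh(z)
            z = z * torch.exp(s) + t
        return z.squeeze(-1)                     # shape: (n_samples, batch)

# Usage sketch: base_model is the trained regressor and stays frozen.
# context = base_model(x_calib).detach()          # no retraining needed
# flow = ConditionalFlow1D(context_dim=context.shape[-1])
# opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
# loss = -flow.log_prob(y_calib, context).mean()  # maximum-likelihood fit
# samples = flow.sample(context_new)              # full predictive distribution
# lo, hi = samples.quantile(0.05, dim=0), samples.quantile(0.95, dim=0)
```

Because the flow is trained only on the frozen model's outputs, the base predictor is untouched, and the sampled predictive distribution can expose asymmetry or multimodality that a plain prediction interval would hide.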
