Fast and Scalable Score-Based Kernel Calibration Tests
Conference on Uncertainty in Artificial Intelligence (UAI), 2025

Main: 10 pages
Figures: 15
Bibliography: 4 pages
Appendix: 12 pages
Abstract
We introduce the Kernel Calibration Conditional Stein Discrepancy test (KCCSD test), a non-parametric, kernel-based test for assessing the calibration of probabilistic models with well-defined scores. In contrast to previous methods, our test avoids the need for possibly expensive expectation approximations while providing control over its type-I error. We achieve these improvements by using a new family of kernels for score-based probabilities that can be estimated without probability density samples, and by using a conditional goodness-of-fit criterion for the KCCSD test's U-statistic. We demonstrate the properties of our test on various synthetic settings.
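The abstract builds on the kernelized Stein discrepancy, whose U-statistic can be computed from model scores (gradients of the log-density) without density evaluations. As a hedged illustration of that general ingredient, not the paper's KCCSD test itself, the sketch below computes the standard KSD U-statistic with an RBF kernel; the function names and the bandwidth `h` are illustrative choices.

```python
import numpy as np

def ksd_ustat(X, score_fn, h=1.0):
    """KSD U-statistic with an RBF kernel k(x, y) = exp(-||x-y||^2 / (2 h^2)).

    X: (n, d) array of samples; score_fn maps an (n, d) array of points
    to the (n, d) array of scores grad_x log p(x) of the model p.
    Only scores are needed, never density values -- the property the
    abstract exploits.
    """
    n, d = X.shape
    S = score_fn(X)                        # (n, d) scores at the samples
    diff = X[:, None, :] - X[None, :, :]   # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)        # (n, n) squared distances
    K = np.exp(-sq / (2 * h ** 2))         # RBF Gram matrix

    # Closed-form Stein kernel u_p(x, y) for the RBF base kernel:
    # s(x)^T s(y) k  +  s(x)^T grad_y k  +  s(y)^T grad_x k  +  tr(grad_x grad_y k)
    t1 = (S @ S.T) * K
    t2 = np.einsum('id,ijd->ij', S, diff) / h ** 2 * K    # grad_y k = (x - y)/h^2 * k
    t3 = -np.einsum('jd,ijd->ij', S, diff) / h ** 2 * K   # grad_x k = -(x - y)/h^2 * k
    t4 = (d / h ** 2 - sq / h ** 4) * K
    U = t1 + t2 + t3 + t4

    # U-statistic: average over off-diagonal pairs i != j
    return (U.sum() - np.trace(U)) / (n * (n - 1))

rng = np.random.default_rng(0)
score_std_normal = lambda x: -x          # score of N(0, I)
X_match = rng.standard_normal((300, 2))  # samples from the model
X_shift = X_match + 2.0                  # samples from a shifted distribution
ksd_match = ksd_ustat(X_match, score_std_normal)
ksd_shift = ksd_ustat(X_shift, score_std_normal)
```

With matched samples the statistic concentrates near zero, while the shifted sample yields a clearly larger value; a calibration test then compares such a statistic against a null distribution (e.g. via a wild bootstrap) to control the type-I error.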
