
Revisiting Likelihood-Based Out-of-Distribution Detection by Modeling Representations

Scandinavian Conference on Image Analysis (SCIA), 2025
Main: 11 pages · Appendix: 4 pages · Bibliography: 3 pages · 6 figures · 7 tables
Abstract

Out-of-distribution (OOD) detection is critical for ensuring the reliability of deep learning systems, particularly in safety-critical applications. Likelihood-based deep generative models have historically faced criticism for their unsatisfactory performance in OOD detection, often assigning higher likelihood to OOD data than to in-distribution samples when applied to image data. In this work, we demonstrate that likelihood is not inherently flawed. Rather, several properties of the image space prevent likelihood from serving as a valid detection score. Given a sufficiently good likelihood estimator, specifically one using the probability flow formulation of a diffusion model, we show that likelihood-based methods can still perform on par with state-of-the-art methods when applied in the representation space of pre-trained encoders. The code for our work can be found at \href{this https URL}{\texttt{this https URL}}.
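The likelihood estimator the abstract refers to rests on the instantaneous change-of-variables formula: integrating the probability flow ODE of a diffusion model, together with the divergence of its drift, yields an exact log-likelihood. The following is a minimal, hypothetical sketch of that computation for a VP diffusion whose score is available in closed form (Gaussian toy data standing in for encoder representations); in the paper's setting the score would instead come from a diffusion model trained on the embeddings of a pre-trained encoder, and the divergence would typically be estimated with a Hutchinson trace estimator. All names and constants below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy VP diffusion with constant noise schedule beta(t) = BETA over [0, T].
# "In-distribution" data is x_0 ~ N(0, SIGMA0_SQ * I), so the marginal score
# is known analytically; this stands in for a trained score network.
BETA = 1.0       # constant beta(t) (illustrative choice)
T = 5.0          # integration horizon
SIGMA0_SQ = 4.0  # variance of the toy in-distribution Gaussian

def marginal_var(t):
    """Variance of the VP marginal at time t when x_0 ~ N(0, SIGMA0_SQ * I)."""
    a2 = np.exp(-BETA * t)                   # alpha_t^2
    return SIGMA0_SQ * a2 + (1.0 - a2)       # sigma0^2 * alpha^2 + (1 - alpha^2)

def pf_ode_rhs(t, x):
    """Probability flow ODE drift f(x, t) = -0.5 * beta * (x + score(x, t))."""
    score = -x / marginal_var(t)             # closed-form Gaussian score
    return -0.5 * BETA * (x + score)

def pf_ode_div(t, d):
    """Exact divergence of f; the score is linear, so the trace is analytic."""
    return -0.5 * BETA * d * (1.0 - 1.0 / marginal_var(t))

def log_likelihood(x0, n_steps=2000):
    """log p_0(x0) = log p_T(x_T) + int_0^T div f(x_t, t) dt, integrated by RK4."""
    d = x0.size
    x, logdet = x0.copy(), 0.0
    h = T / n_steps
    for i in range(n_steps):
        t = i * h
        # RK4 step for the state along the probability flow ODE
        k1 = pf_ode_rhs(t, x)
        k2 = pf_ode_rhs(t + h / 2, x + h / 2 * k1)
        k3 = pf_ode_rhs(t + h / 2, x + h / 2 * k2)
        k4 = pf_ode_rhs(t + h, x + h * k3)
        x = x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        # Simpson step for the divergence integral (depends only on t here)
        logdet += h / 6 * (pf_ode_div(t, d)
                           + 4 * pf_ode_div(t + h / 2, d)
                           + pf_ode_div(t + h, d))
    vT = marginal_var(T)                     # base density N(0, vT * I) at time T
    log_pT = -0.5 * d * np.log(2 * np.pi * vT) - 0.5 * np.dot(x, x) / vT
    return log_pT + logdet

x0 = np.array([1.0, -2.0, 0.5])
ll = log_likelihood(x0)
# Analytic reference: log N(x0; 0, SIGMA0_SQ * I) — the two should agree closely
ref = (-0.5 * x0.size * np.log(2 * np.pi * SIGMA0_SQ)
       - 0.5 * np.dot(x0, x0) / SIGMA0_SQ)
print(ll, ref)
```

Because the toy score is exact, the ODE-based log-likelihood matches the analytic Gaussian log-density, which is a useful correctness check before swapping in a learned score model; the resulting negative log-likelihood of a test embedding would then serve as the OOD score.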
