EOOD: Entropy-based Out-of-distribution Detection

Deep neural networks (DNNs) often exhibit overconfidence when encountering out-of-distribution (OOD) samples, posing significant challenges for deployment. Since DNNs are trained on in-distribution (ID) datasets, the information flow of ID samples through a DNN inevitably differs from that of OOD samples. In this paper, we propose an Entropy-based Out-Of-distribution Detection (EOOD) framework. EOOD first identifies the specific block where the information flow differences between ID and OOD samples are most pronounced, using both ID and pseudo-OOD samples. It then calculates the conditional entropy on the selected block as the OOD confidence score. Comprehensive experiments conducted across various ID and OOD settings demonstrate the effectiveness of EOOD in OOD detection and its superiority over state-of-the-art methods.
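The entropy-scoring idea from the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the normalization of block activations into a distribution, the plain Shannon-entropy formula, and the sign convention (flatter activations scoring higher) are all assumptions made for the example.

```python
import numpy as np

def block_entropy_score(activations, eps=1e-12):
    """Entropy of a selected block's activation pattern (illustrative).

    `activations`: 1-D array of non-negative feature activations from the
    chosen block. Normalizing them into a distribution and taking Shannon
    entropy is an assumption, not the paper's exact conditional-entropy
    formulation.
    """
    a = np.clip(np.asarray(activations, dtype=float), 0.0, None)
    p = a / (a.sum() + eps)                      # normalize to a distribution
    return float(-(p * np.log(p + eps)).sum())   # Shannon entropy

# A peaked (ID-like) pattern yields lower entropy than a flat (OOD-like) one,
# so the score could be thresholded to flag OOD inputs.
peaked = np.array([9.0, 0.5, 0.3, 0.2])
flat = np.array([1.0, 1.0, 1.0, 1.0])
assert block_entropy_score(peaked) < block_entropy_score(flat)
```

In practice the block would be chosen, as the abstract describes, by comparing how ID and pseudo-OOD samples flow through each block and picking the one where the gap is largest.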
@article{yang2025_2504.03342,
  title={EOOD: Entropy-based Out-of-distribution Detection},
  author={Guide Yang and Chao Hou and Weilong Peng and Xiang Fang and Yongwei Nie and Peican Zhu and Keke Tang},
  journal={arXiv preprint arXiv:2504.03342},
  year={2025}
}