Calibration of Ordinal Regression Networks

Abstract

Recent studies have shown that deep neural networks are not well-calibrated and often produce over-confident predictions. This miscalibration primarily stems from the use of cross-entropy in classification, which aims to align predicted softmax probabilities with one-hot labels. In ordinal regression tasks, the problem is compounded by an additional challenge: cross-entropy does not enforce the expectation that the softmax probabilities follow a unimodal distribution over the ordered classes. Moreover, the ordinal regression literature has focused on learning ordering relationships while largely overlooking calibration. To address both issues, we propose a novel loss function that introduces ordinal-aware calibration, ensuring that prediction confidence adheres to the ordinal relationships between classes. It incorporates soft ordinal encoding and ordinal-aware regularization to enforce both calibration and unimodality. Extensive experiments across four popular ordinal regression benchmarks demonstrate that our approach achieves state-of-the-art calibration without compromising classification accuracy.
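
As an illustration of the two ingredients named in the abstract, the sketch below shows one plausible instantiation: distance-based soft ordinal targets combined with a simple unimodality penalty on the predicted probabilities. The specific target form, the penalty form, and the hyperparameters tau and lam are assumptions made for illustration; they are not the paper's exact formulation.

# Hedged sketch: a plausible form of "soft ordinal encoding" plus an
# ordinal-aware (unimodality) regularizer. The encoding, penalty, and the
# hyperparameters tau / lam are illustrative assumptions, not the paper's method.
import torch
import torch.nn.functional as F

def soft_ordinal_targets(labels, num_classes, tau=1.0):
    # Soft labels q_k proportional to exp(-|k - y| / tau): probability mass decays
    # with ordinal distance from the true class y, instead of a one-hot target.
    ranks = torch.arange(num_classes, device=labels.device).float()    # (K,)
    dist = (ranks.unsqueeze(0) - labels.unsqueeze(1).float()).abs()    # (B, K)
    return F.softmax(-dist / tau, dim=1)

def unimodality_penalty(probs, labels):
    # Penalize probability that increases while moving away from the true class,
    # pushing the predicted distribution toward a single peak at the label.
    diffs = probs[:, 1:] - probs[:, :-1]                               # (B, K-1)
    ranks = torch.arange(probs.size(1) - 1, device=probs.device)
    right_of_label = ranks.unsqueeze(0) >= labels.unsqueeze(1)        # slope should be <= 0 there
    penalty = torch.where(right_of_label, diffs.clamp(min=0), (-diffs).clamp(min=0))
    return penalty.sum(dim=1).mean()

def ordinal_calibration_loss(logits, labels, tau=1.0, lam=0.1):
    # Cross-entropy against soft ordinal targets plus the unimodality regularizer.
    log_probs = F.log_softmax(logits, dim=1)
    targets = soft_ordinal_targets(labels, logits.size(1), tau)
    ce = -(targets * log_probs).sum(dim=1).mean()
    return ce + lam * unimodality_penalty(log_probs.exp(), labels)

Under these assumptions the loss is used as a drop-in replacement for cross-entropy, e.g. loss = ordinal_calibration_loss(model(x), y).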

@article{kim2025_2410.15658,
  title={Calibration of Ordinal Regression Networks},
  author={Daehwan Kim and Haejun Chung and Ikbeom Jang},
  journal={arXiv preprint arXiv:2410.15658},
  year={2025}
}