Outperformance Score: A Universal Standardization Method for Confusion-Matrix-Based Classification Performance Metrics

Abstract

Many classification performance metrics exist, each suited to a specific application. However, these metrics often differ in scale and can exhibit varying sensitivity to class imbalance rates in the test set. As a result, it is difficult to use the nominal values of these metrics to interpret and evaluate classification performance, especially when imbalance rates vary. To address this problem, we introduce the outperformance score function, a universal standardization method for confusion-matrix-based classification performance (CMBCP) metrics. It maps any given metric to a common scale of [0,1] while providing a clear and consistent interpretation. Specifically, the outperformance score represents the percentile rank of the observed classification performance within a reference distribution of possible performances. This unified framework enables meaningful comparison and monitoring of classification performance across test sets with differing imbalance rates. We illustrate how outperformance scores can be applied to a variety of commonly used classification performance metrics and demonstrate the robustness of our method through experiments on real-world datasets spanning multiple classification applications.
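The core idea described above, standardizing a metric as the percentile rank of the observed value within a reference distribution of possible performances, can be sketched in a few lines. The sketch below is illustrative only: the function names are ours, and the reference distribution here is built from random guessing on an imbalanced test set, whereas the paper defines its own reference distribution for CMBCP metrics.

```python
import random

def outperformance_score(observed, reference):
    """Percentile rank of `observed` within the reference distribution:
    the fraction of reference values less than or equal to it."""
    return sum(1 for r in reference if r <= observed) / len(reference)

def accuracy(y_true, y_pred):
    """Simple confusion-matrix-based metric used for illustration."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example: an imbalanced test set (80% negatives). The reference
# distribution is the accuracy of random guessing, simulated 1000 times.
random.seed(0)
y_true = [0] * 80 + [1] * 20
reference = [
    accuracy(y_true, [random.randint(0, 1) for _ in y_true])
    for _ in range(1000)
]

# An observed accuracy of 0.9 is mapped onto the common [0, 1] scale.
score = outperformance_score(0.9, reference)
```

Whatever the raw metric's scale or sensitivity to imbalance, the resulting score always lies in [0,1] and reads as "the observed performance is at least as good as this fraction of the reference performances."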

@article{zhao2025_2505.07033,
  title={Outperformance Score: A Universal Standardization Method for Confusion-Matrix-Based Classification Performance Metrics},
  author={Ningsheng Zhao and Trang Bui and Jia Yuan Yu and Krzysztof Dzieciolowski},
  journal={arXiv preprint arXiv:2505.07033},
  year={2025}
}