Retiring ΔDP: New Distribution-Level Metrics for Demographic Parity

31 January 2023
Xiaotian Han
Zhimeng Jiang
Hongye Jin
Zirui Liu
Na Zou
Qifan Wang
Xia Hu
Abstract

Demographic parity is the most widely recognized measure of group fairness in machine learning, ensuring equal treatment of different demographic groups. Numerous works aim to achieve demographic parity by optimizing the commonly used metric ΔDP. Unfortunately, in this paper, we reveal that the fairness metric ΔDP cannot precisely measure the violation of demographic parity, because it has the following inherent drawbacks: i) a zero-value ΔDP does not guarantee zero violation of demographic parity, and ii) ΔDP values can vary with different classification thresholds. To this end, we propose two new fairness metrics, Area Between Probability density function Curves (ABPC) and Area Between Cumulative density function Curves (ABCC), to precisely measure the violation of demographic parity at the distribution level. The new fairness metrics directly measure the difference between the distributions of the prediction probability for different demographic groups. Thus, our proposed metrics enjoy: i) a zero-value ABCC/ABPC guarantees zero violation of demographic parity; ii) ABCC/ABPC guarantees demographic parity even when the classification threshold is adjusted. We further re-evaluate existing fair models with our proposed fairness metrics and observe different fairness behaviors of those models under the new metrics. The code is available at https://github.com/ahxt/new_metric_for_demographic_parity
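The two metrics reduce to L1 distances between the groups' score distributions: ABPC integrates the absolute difference of the two probability density functions, and ABCC integrates the absolute difference of the two cumulative distribution functions. Below is a minimal sketch of how such quantities could be computed, not the authors' reference implementation (see the linked repository for that). It assumes `scores_0` and `scores_1` are 1-D arrays of prediction probabilities for two demographic groups; the function names, grid size, and the choice of Gaussian KDE for the density estimates are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde


def abpc(scores_0, scores_1, n_grid=10_000):
    """Area Between Probability density function Curves (ABPC):
    integral of |f0(x) - f1(x)| over [0, 1], with PDFs estimated by KDE."""
    grid = np.linspace(0.0, 1.0, n_grid)
    f0 = gaussian_kde(scores_0)(grid)  # estimated PDF, group 0
    f1 = gaussian_kde(scores_1)(grid)  # estimated PDF, group 1
    return np.trapz(np.abs(f0 - f1), grid)


def abcc(scores_0, scores_1, n_grid=10_000):
    """Area Between Cumulative density function Curves (ABCC):
    integral of |F0(x) - F1(x)| over [0, 1], using empirical CDFs."""
    grid = np.linspace(0.0, 1.0, n_grid)
    F0 = np.searchsorted(np.sort(scores_0), grid, side="right") / len(scores_0)
    F1 = np.searchsorted(np.sort(scores_1), grid, side="right") / len(scores_1)
    return np.trapz(np.abs(F0 - F1), grid)


if __name__ == "__main__":
    # Synthetic prediction probabilities for two groups (illustrative only).
    rng = np.random.default_rng(0)
    s0 = rng.beta(2, 5, size=5_000)
    s1 = rng.beta(2, 4, size=5_000)
    print(f"ABPC = {abpc(s0, s1):.4f}, ABCC = {abcc(s0, s1):.4f}")
```

Under this reading, both metrics are zero exactly when the two score distributions coincide, which is why a zero value certifies demographic parity regardless of where the classification threshold is placed.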
