
CALF: A Conditionally Adaptive Loss Function to Mitigate Class-Imbalanced Segmentation

Abstract

Imbalanced datasets pose a considerable challenge in training deep learning (DL) models for medical diagnostics, particularly for segmentation tasks. Imbalance may be associated with annotation quality, limited annotated datasets, rare cases, or small-scale regions of interest (ROIs). These conditions adversely affect model training and performance, leading to segmentation boundaries that deviate from the true ROIs. Traditional loss functions, such as Binary Cross Entropy, replicate annotation biases and limit model generalization. We propose a novel, statistically driven, conditionally adaptive loss function (CALF) tailored to accommodate the conditions of imbalanced datasets in DL training. It employs a data-driven methodology, estimating imbalance severity using the statistical measures of skewness and kurtosis and then applying an appropriate transformation to balance the training dataset while preserving data heterogeneity. This transformative approach integrates a multifaceted process encompassing preprocessing, dataset filtering, and dynamic loss selection to achieve optimal outcomes. We benchmark our method against conventional loss functions using qualitative and quantitative evaluations. Experiments using large-scale open-source datasets (i.e., UPENN-GBM, UCSF, LGG, and BraTS) validate our approach, demonstrating substantial segmentation improvements. Code availability: this https URL.
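The conditional mechanism the abstract describes, estimating imbalance severity from the skewness and kurtosis of ROI sizes and switching the training loss accordingly, can be sketched roughly as follows. This is a minimal illustration under assumed thresholds and loss choices, not the authors' CALF implementation; the function select_loss, its inputs, and the Dice/BCE alternatives are hypothetical.

import numpy as np
from scipy.stats import skew, kurtosis

def select_loss(foreground_fractions, skew_threshold=1.0, kurt_threshold=3.0):
    """Illustrative conditional loss selection from dataset imbalance statistics.

    foreground_fractions: per-sample ratio of ROI (foreground) voxels to total voxels.
    The thresholds are placeholders, not values taken from the paper.
    """
    fractions = np.asarray(foreground_fractions, dtype=float)
    s = skew(fractions)
    k = kurtosis(fractions, fisher=False)  # Pearson kurtosis (normal distribution = 3)

    if abs(s) > skew_threshold or k > kurt_threshold:
        # Strongly skewed or heavy-tailed ROI-size distribution:
        # prefer a region-overlap loss that is less sensitive to class frequency.
        return "dice_loss"
    # Near-symmetric, light-tailed distribution: plain BCE is adequate.
    return "bce_loss"

In a training pipeline, a selector like this would typically run once over the filtered dataset during preprocessing, with the returned choice fixing (or weighting) the loss used for the subsequent epochs.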

@article{alam2025_2504.04458,
  title={CALF: A Conditionally Adaptive Loss Function to Mitigate Class-Imbalanced Segmentation},
  author={Bashir Alam and Masa Cirkovic and Mete Harun Akcay and Md Kaf Shahrier and Sebastien Lafond and Hergys Rexha and Kurt Benke and Sepinoud Azimi and Janan Arslan},
  journal={arXiv preprint arXiv:2504.04458},
  year={2025}
}