Powerful batch conformal prediction for classification

Abstract

In a split conformal framework with $K$ classes, a calibration sample of $n$ labeled examples is observed for inference on the label of a new unlabeled example. We explore the setting where a `batch' of $m$ independent such unlabeled examples is given, and the goal is to construct a batch prediction set with $1-\alpha$ coverage. Unlike individual prediction sets, the batch prediction set is a collection of label vectors of size $m$, while the calibration sample consists of univariate labels. A natural approach is to apply the Bonferroni correction, which concatenates individual prediction sets at level $1-\alpha/m$. We propose a uniformly more powerful solution, based on specific combinations of conformal $p$-values that exploit the Simes inequality. We provide a general recipe for valid inference with any combination of conformal $p$-values, and compare the performance of several useful choices. Intuitively, the pooled evidence of relatively `easy' examples within the batch can help provide narrower batch prediction sets. Additionally, we introduce a more computationally intensive method that aggregates batch scores and can be even more powerful. The theoretical guarantees are established when all examples are independent and identically distributed (iid), as well as more generally when iid is assumed only conditionally within each class. Notably, our results remain valid under label distribution shift, since the distribution of the labels need not be the same in the calibration sample and in the new batch. The effectiveness of the methods is highlighted through illustrative synthetic and real data examples.
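
A minimal sketch (not the authors' implementation) of the batch constructions described above: it contrasts the Bonferroni product set with a Simes-type combination of split-conformal p-values. The nonconformity score, the brute-force enumeration over label vectors, and the exact form of the combination are illustrative assumptions; the paper's procedures may differ in detail.

import itertools
import numpy as np

def conformal_pvalues(cal_scores, test_scores):
    """Split-conformal p-values.
    cal_scores: shape (n,), scores of calibration points at their true labels.
    test_scores: shape (m, K), score of each test point at each candidate label.
    p_{j,y} = (1 + #{i : s_i >= s_{j,y}}) / (n + 1).
    """
    n = cal_scores.shape[0]
    ge = (cal_scores[None, None, :] >= test_scores[:, :, None]).sum(axis=-1)
    return (1.0 + ge) / (n + 1.0)

def bonferroni_batch_set(pvals, alpha):
    # Product of the m individual prediction sets, each at level 1 - alpha/m.
    m, _ = pvals.shape
    individual = [np.flatnonzero(pvals[j] > alpha / m) for j in range(m)]
    return set(itertools.product(*individual))

def simes_batch_set(pvals, alpha):
    # Keep a label vector iff the Simes combination of its m p-values exceeds alpha.
    m, K = pvals.shape
    batch = set()
    for labels in itertools.product(range(K), repeat=m):
        p = np.sort([pvals[j, y] for j, y in enumerate(labels)])
        if np.min(m * p / np.arange(1, m + 1)) > alpha:
            batch.add(labels)
    return batch

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, K, alpha = 200, 3, 4, 0.1
    # Toy scores standing in for, e.g., 1 - softmax probability of a label.
    cal_scores = rng.beta(2, 5, size=n)
    test_scores = rng.beta(2, 5, size=(m, K))
    pvals = conformal_pvalues(cal_scores, test_scores)
    print("Bonferroni batch set size:", len(bonferroni_batch_set(pvals, alpha)))
    print("Simes-based batch set size:", len(simes_batch_set(pvals, alpha)))

Because the Simes criterion rejects whenever the Bonferroni criterion does (its k = 1 term is exactly the Bonferroni condition), the Simes-based batch set is always contained in the Bonferroni product set, consistent with the "uniformly more powerful" claim in the abstract.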

@article{gazin2025_2411.02239,
  title={Powerful batch conformal prediction for classification},
  author={Ulysse Gazin and Ruth Heller and Etienne Roquain and Aldo Solari},
  journal={arXiv preprint arXiv:2411.02239},
  year={2025}
}