Powerful batch conformal prediction for classification

International Conference on Artificial Intelligence and Statistics (AISTATS), 2024
Main: 6 pages · Appendix: 26 pages · Bibliography: 2 pages · 8 figures · 12 tables
Abstract

In a supervised classification split conformal (inductive) framework with K classes, a calibration sample of n labeled examples is observed for inference on the label of a new unlabeled example. In this work, we explore the case where a "batch" of m independent such unlabeled examples is given, and a multivariate prediction set with 1-α coverage should be provided for this batch. Hence, the batch prediction set takes the form of a collection of label vectors of size m, while the calibration sample only contains univariate labels. Applying the Bonferroni correction amounts to concatenating the individual prediction sets at level 1-α/m (Vovk, 2013). We propose a uniformly more powerful solution, based on specific combinations of conformal p-values that exploit the Simes inequality (Simes, 1986). Intuitively, the pooled evidence of fairly "easy" examples in the batch can help provide narrower batch prediction sets. We also introduce adaptive versions of the novel procedure that are particularly effective when the batch prediction set is expected to be large. Theoretical guarantees are provided when all examples are i.i.d., as well as, more generally, when the i.i.d. assumption holds only conditionally within each class. In particular, our results remain valid under label distribution shift, since the distribution of the labels need not be the same in the calibration sample and in the new batch. The usefulness of the method is illustrated on synthetic and real data examples.
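To make the Bonferroni-versus-Simes comparison concrete, the following is a minimal sketch (not the paper's exact adaptive procedures) using randomly generated nonconformity scores. Both batch prediction sets are built by screening all K^m candidate label vectors: Bonferroni keeps a vector when every individual conformal p-value exceeds α/m, while the Simes-based rule keeps it only when each sorted p-value p_(i) exceeds i·α/m, so the Simes set is always contained in the Bonferroni set.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def conformal_p_value(calib_scores, test_score):
    """Standard split-conformal p-value: rank of the test score among calibration scores."""
    n = len(calib_scores)
    return (1 + np.sum(calib_scores >= test_score)) / (n + 1)

# Toy, hypothetical setup: K classes, n calibration examples, batch of size m.
K, n, m, alpha = 3, 200, 2, 0.1
calib_scores = rng.uniform(size=n)       # nonconformity scores of calibration examples
test_scores = rng.uniform(size=(m, K))   # score of each candidate label, per batch example

# p[j, y] = conformal p-value of candidate label y for batch example j
p = np.array([[conformal_p_value(calib_scores, test_scores[j, y])
               for y in range(K)] for j in range(m)])

# Bonferroni batch set: Cartesian product of individual sets at level 1 - alpha/m,
# i.e. keep a label vector v iff every p[j, v[j]] > alpha/m.
bonf = [v for v in itertools.product(range(K), repeat=m)
        if all(p[j, v[j]] > alpha / m for j in range(m))]

# Simes-based batch set: keep v unless the Simes test rejects it,
# i.e. unless some sorted p-value satisfies p_(i) <= i * alpha / m.
simes = [v for v in itertools.product(range(K), repeat=m)
         if all(ps > (i + 1) * alpha / m
                for i, ps in enumerate(sorted(p[j, v[j]] for j in range(m))))]

# The Simes condition implies the Bonferroni one (take i = 1), so simes ⊆ bonf:
assert set(simes) <= set(bonf)
```

Since the Simes condition at i = 1 is exactly the Bonferroni condition, the Simes set can only be smaller, which is the sense in which the combination is uniformly more powerful.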
