
Metric Entropy Duality and the Sample Complexity of Outcome Indistinguishability

Abstract

We give the first sample complexity characterizations for outcome indistinguishability, a theoretical framework of machine learning recently introduced by Dwork, Kim, Reingold, Rothblum, and Yona (STOC 2021). In outcome indistinguishability, the goal of the learner is to output a predictor that cannot be distinguished from the target predictor by a class D of distinguishers examining the outcomes generated according to the predictors' predictions. In the distribution-specific and realizable setting where the learner is given the data distribution together with a predictor class P containing the target predictor, we show that the sample complexity of outcome indistinguishability is characterized by the metric entropy of P w.r.t. the dual Minkowski norm defined by D, and equivalently by the metric entropy of D w.r.t. the dual Minkowski norm defined by P. This equivalence makes an intriguing connection to the long-standing metric entropy duality conjecture in convex geometry. Our sample complexity characterization implies a variant of metric entropy duality, which we show is nearly tight. In the distribution-free setting, we focus on the case considered by Dwork et al. where P contains all possible predictors, hence the sample complexity only depends on D. In this setting, we show that the sample complexity of outcome indistinguishability is characterized by the fat-shattering dimension of D. We also show a strong sample complexity separation between realizable and agnostic outcome indistinguishability in both the distribution-free and the distribution-specific settings. This is in contrast to distribution-free (resp. distribution-specific) PAC learning, where the sample complexity in both the realizable and the agnostic settings can be characterized by the VC dimension (resp. metric entropy).
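As a quick reference, the following LaTeX sketch records one standard way to define the two quantities the abstract relies on; the notation (data distribution \mu, predictor class P, distinguisher class D) is assumed for illustration and is not quoted from the paper.

% Illustrative, standard-form definitions (notation assumed, not excerpted from the paper).
% Dual Minkowski norm induced by the distinguisher class D over data distribution \mu:
\[
  \|f\|_{D} \;=\; \sup_{d \in D} \bigl| \mathbb{E}_{x \sim \mu}\bigl[ d(x)\, f(x) \bigr] \bigr|
\]
% Covering number of P at scale \varepsilon under this norm, and the corresponding metric entropy:
\[
  N(P, \varepsilon, \|\cdot\|_{D}) \;=\; \min\Bigl\{ |C| \;:\; C \subseteq P,\;
    \forall p \in P \;\exists c \in C,\; \|p - c\|_{D} \le \varepsilon \Bigr\},
  \qquad
  \log N(P, \varepsilon, \|\cdot\|_{D}) \;=\; \text{metric entropy of } P.
\]

Under these (assumed) definitions, \|p - p^{*}\|_{D} is exactly the largest distinguishing advantage any d \in D can achieve between a candidate predictor p and the target p^{*}, which is what makes the metric entropy of P under \|\cdot\|_{D} a natural measure of the difficulty of outcome indistinguishability.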
