Private Realizable-to-Agnostic Transformation with Near-Optimal Sample Complexity
The realizable-to-agnostic transformation (Beimel et al., 2015; Alon et al., 2020) provides a general mechanism to convert a private learner in the realizable setting (where the examples are labeled by some function in the concept class) into a private learner in the agnostic setting (where no assumptions are imposed on the data). Specifically, for any concept class $\mathcal{C}$ and error parameter $\alpha$, a private realizable learner for $\mathcal{C}$ can be transformed into a private agnostic learner at the cost of an additive increase in sample complexity, and this overhead is essentially tight assuming a constant privacy parameter $\varepsilon$. However, when $\varepsilon$ can be arbitrary, one has to apply the standard privacy-amplification-by-subsampling technique (Kasiviswanathan et al., 2011), which makes the extra sample complexity suboptimal by a factor of $1/\varepsilon$. In this work, we give an improved construction that eliminates this dependence on $\varepsilon$, achieving a near-optimal extra sample complexity, matching the constant-$\varepsilon$ overhead, for any $\varepsilon$. Moreover, our result reveals that in private agnostic learning, the privacy cost is significant only for the realizable part. We also leverage our technique to obtain a nearly tight sample complexity bound for the private prediction problem, resolving an open question posed by Dwork and Feldman (2018) and Dagan and Feldman (2020).
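To see where the $1/\varepsilon$ factor in the subsampling approach comes from, one can look at the standard amplification-by-subsampling bound: running an $\varepsilon_0$-DP algorithm on a random $q$-fraction of the dataset satisfies $\varepsilon'$-DP with $\varepsilon' = \log(1 + q(e^{\varepsilon_0} - 1)) \approx q\varepsilon_0$ for constant $\varepsilon_0$. The sketch below is illustrative only (it is not part of the paper's construction) and simply computes this bound and the subsampling rate $q$ needed to hit a small target $\varepsilon$, which is what inflates the sample complexity by roughly $1/\varepsilon$:

```python
import math

def amplified_eps(eps0: float, q: float) -> float:
    # Privacy amplification by subsampling: an eps0-DP mechanism run on
    # a random q-fraction subsample is eps'-DP with
    #   eps' = log(1 + q * (exp(eps0) - 1)).
    return math.log(1.0 + q * (math.exp(eps0) - 1.0))

def required_rate(eps_target: float, eps0: float) -> float:
    # Invert the bound: the subsampling rate q needed so that the
    # amplified guarantee meets a target eps.
    return (math.exp(eps_target) - 1.0) / (math.exp(eps0) - 1.0)

# With a constant base parameter eps0, reaching a small target eps forces
# q to be on the order of eps, so the base learner only sees a q-fraction
# of its input and the total sample complexity grows by roughly 1/eps.
eps0 = 1.0
for eps_target in (0.1, 0.01):
    q = required_rate(eps_target, eps0)
    print(f"target eps={eps_target}: q={q:.5f}, "
          f"amplified eps={amplified_eps(eps0, q):.5f}")
```

For example, with $\varepsilon_0 = 1$ and a target $\varepsilon = 0.01$, the required rate $q$ is below $0.01$, i.e., more than a $100\times$ blow-up in the number of samples, which is exactly the dependence the paper's construction removes.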