On the Information Complexity of Proper Learners for VC Classes in the Realizable Case

Abstract

We provide a negative resolution to a conjecture of Steinke and Zakynthinou (2020a), by showing that their bound on the conditional mutual information (CMI) of proper learners of Vapnik--Chervonenkis (VC) classes cannot be improved from $d \log n + 2$ to $O(d)$, where $n$ is the number of i.i.d. training examples. In fact, we exhibit VC classes for which the CMI of any proper learner cannot be bounded by any real-valued function of the VC dimension only.
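For context, a brief LaTeX sketch of the quantity the abstract refers to. The supersample-based definition below follows the standard CMI formulation usually attributed to Steinke and Zakynthinou; the notation ($\tilde{Z}$, $U$, $S_U$, $A$) is not spelled out in this abstract and should be read as an assumption, not the authors' exact statement.

```latex
% Sketch of the conditional mutual information (CMI) framework referenced above.
% Notation ($\tilde{Z}$, $U$, $S_U$, $A$) is assumed here, not taken from the abstract.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Let $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ be a supersample of $2n$ i.i.d.\ examples
and let $U \sim \mathrm{Unif}(\{0,1\}^n)$ select one example from each of the $n$ pairs,
yielding the training set $S_U$. The CMI of a learning algorithm $A$ is
\[
  \mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\!\left( A(S_U);\, U \,\middle|\, \tilde{Z} \right).
\]
The conjecture in question would replace the known bound for proper learners of a
VC class of dimension $d$,
\[
  \mathrm{CMI}_{\mathcal{D}}(A) \;\le\; d \log n + 2,
\]
with a bound of order $O(d)$, independent of the sample size $n$; the abstract states
that no such sample-size-free bound is possible.

\end{document}
```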
