
Average-Case Information Complexity of Learning

Abstract

How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension $d$? Previous works have shown that even for $d=1$ the amount of information may be unbounded (it tends to $\infty$ with the size of the universe). Can it be that all concepts in the class require leaking a large amount of information? We show that typically concepts do not require such leakage: there exists a proper learning algorithm that reveals $O(d)$ bits of information for most concepts in the class. This result is a special case of a more general phenomenon we explore. If there is a low-information learner when the algorithm {\em knows} the underlying distribution on inputs, then there is a learner that reveals little information on an average concept {\em without knowing} the distribution on inputs.
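For concreteness, one natural reading of the information cost in this line of work (an assumption on our part; the abstract does not spell out the measure) is the mutual information between the labeled sample $S$ and the hypothesis output by the learner $A$:
\[
  \mathrm{IC}(A) \;=\; I\bigl(S;\,A(S)\bigr).
\]
Under this reading, the claim is that for a class of VC-dimension $d$ there is a proper learner with $I(S; A(S)) = O(d)$ bits for most target concepts, even though for some concepts (already when $d=1$) this quantity must grow with the size of the universe.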
