Information Width

Abstract

Kolmogorov argued that the concept of information also applies to problems with no underlying stochastic model (in contrast to Shannon's representation of information), for instance, the information contained in an algorithm or in the genome. He introduced a combinatorial notion of entropy and of the information $I(x:y)$ conveyed by a binary string $x$ about the unknown value of a variable $y$. The current paper poses the following questions: What is the relationship between the information conveyed by $x$ about $y$ and the description complexity of $x$? Is there a notion of the cost of information? Are there limits on how efficiently $x$ conveys information? To answer these questions, Kolmogorov's definition is extended and a new concept, termed {\em information width}, is introduced; it is similar to $n$-widths in approximation theory. The information of any input source, e.g., sample-based, general side information, or a hybrid of both, can be evaluated by a single common formula. An application to the space of binary functions is considered.
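For concreteness, a minimal sketch of the combinatorial definition underlying $I(x:y)$, in the spirit of Kolmogorov's combinatorial approach to information; the set $Y_x$ is illustrative notation assumed here, not taken from the abstract:

% Combinatorial entropy of a variable y ranging over a finite set Y:
%   H(y) = log_2 |Y|.
% If the string x narrows the possible values of y to a subset
% Y_x (illustrative notation), the information conveyed by x about y
% is the reduction in log-cardinality:
\[
  I(x : y) \;=\; \log_2 |Y| \;-\; \log_2 |Y_x| .
\]
% Example: if |Y| = 64 and x rules out all but 4 candidates
% (|Y_x| = 4), then I(x : y) = 6 - 2 = 4 bits.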
