Universal Densities Exist for Every Finite Reference Measure
As is known, universal codes, which estimate the entropy rate consistently, exist for stationary ergodic sources over any finite alphabet but not over a countably infinite one. We cast the problem of universal coding into the problem of universal densities with respect to a given reference measure on a countably generated measurable space, examples being the counting measure or the Lebesgue measure. We show that universal densities, which estimate the differential entropy rate consistently, exist if the reference measure is finite, which shows that, in this sense, the assumption of a finite alphabet is not necessary. To exhibit a universal density, we combine the prediction by partial matching (PPM) code with the recently proposed non-parametric differential (NPD) entropy rate estimator, extended by putting a prior both over all Markov orders and over all quantization levels. The proof of universality applies Barron's asymptotic equipartition for densities and the continuity of $f$-divergences for filtrations. As an application, we demonstrate that any universal density induces a strongly consistent Ces\`aro mean estimator of the conditional density given an infinite past, which, as a by-product, solves the problem of universal prediction with the $0$-$1$ loss for a countable alphabet. We also show that there exists a strongly consistent entropy rate estimator with respect to the Lebesgue measure in the class of stationary ergodic Gaussian processes.
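To make the double-mixture construction concrete, here is a minimal sketch in the spirit of the abstract, assuming PPM-type densities $\rho_{k,r}$ of Markov order $k$ computed on the $r$-th quantization of the space; the particular prior $w$ and all notation are illustrative choices, not necessarily the paper's:

\[
\rho(x_1^n) \;=\; \sum_{k \ge 0} \sum_{r \ge 0} w(k)\, w(r)\, \rho_{k,r}(x_1^n),
\qquad
w(j) \;=\; \frac{1}{(j+1)(j+2)}, \quad \sum_{j \ge 0} w(j) = 1 .
\]

Since $\rho \ge w(k)\, w(r)\, \rho_{k,r}$ pointwise, the code length $-\log \rho(x_1^n)$ exceeds $-\log \rho_{k,r}(x_1^n)$ by at most the constant $-\log w(k) w(r)$, so the mixture asymptotically matches the best Markov order and quantization level per symbol.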