Non-Asymptotic Bounds for Vector Quantization

Abstract

Recent results in quantization theory show that, for any fixed probability distribution satisfying some regularity conditions, the convergence rate of the mean-squared expected distortion of the empirical risk minimization strategy is O(1/n), where n is the sample size. However, the dependency of the average distortion on other parameters is not known. This paper offers more general conditions, which may be thought of as margin conditions, under which a sharp upper bound on the expected distortion rate of the empirically optimal quantizer is derived. This upper bound is also proved to be sharp with respect to the dependency of the distortion on other natural parameters of the quantization problem.
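To make the setting concrete, here is a minimal sketch of the empirical risk minimization strategy for quantization. The abstract does not specify an algorithm; this sketch assumes a k-point quantizer fitted with Lloyd's algorithm (the standard k-means heuristic) as an approximation to the empirically optimal quantizer, with `distortion` computing the mean-squared distance to the nearest codepoint. The function names and parameters are illustrative, not from the paper.

```python
import numpy as np

def lloyd_kmeans(X, k, n_iter=50, seed=0):
    """Approximate the empirically optimal k-point quantizer of a sample X
    by Lloyd's algorithm (empirical risk minimization heuristic)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        # Assign each sample to its nearest codepoint.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each codepoint to the mean of its cell (keeps empty cells fixed).
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def distortion(X, centers):
    """Empirical distortion: mean squared distance to the nearest codepoint."""
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return (dists.min(axis=1) ** 2).mean()
```

Evaluating `distortion` on a fresh sample drawn from the same distribution, for quantizers fitted on samples of increasing size n, gives an empirical view of the excess-distortion decay that the O(1/n) rate describes.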
