Testing Closeness of Multivariate Distributions via Ramsey Theory

Abstract

We investigate the statistical task of closeness (or equivalence) testing for multidimensional distributions. Specifically, given sample access to two unknown distributions $\mathbf p, \mathbf q$ on $\mathbb R^d$, we want to distinguish between the case that $\mathbf p = \mathbf q$ versus $\|\mathbf p - \mathbf q\|_{A_k} > \epsilon$, where $\|\mathbf p - \mathbf q\|_{A_k}$ denotes the generalized $A_k$ distance between $\mathbf p$ and $\mathbf q$: the maximum discrepancy between the distributions over any collection of $k$ disjoint, axis-aligned rectangles. Our main result is the first closeness tester for this problem with {\em sub-learning} sample complexity in any fixed dimension, together with a nearly-matching sample complexity lower bound. In more detail, we provide a computationally efficient closeness tester with sample complexity $O\left((k^{6/7}/\mathrm{poly}_d(\epsilon)) \log^d(k)\right)$. On the lower bound side, we establish a qualitatively matching sample complexity lower bound of $\Omega(k^{6/7}/\mathrm{poly}(\epsilon))$, even for $d=2$. These sample complexity bounds are surprising because the sample complexity of the problem in the univariate setting is $\Theta(k^{4/5}/\mathrm{poly}(\epsilon))$. This has the interesting consequence that the jump from one to two dimensions leads to a substantial increase in sample complexity, while further increases in dimension do not. As a corollary of our general $A_k$ tester, we obtain $d_{\mathrm{TV}}$-closeness testers for pairs of $k$-histograms on $\mathbb R^d$ over a common unknown partition, and for pairs of uniform distributions supported on the union of $k$ unknown, disjoint axis-aligned rectangles. Both our algorithm and our lower bound make essential use of tools from Ramsey theory.
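
For concreteness, one natural formalization of the $A_k$ distance described above is the following (a sketch consistent with the abstract's wording; the paper's precise definition may differ in minor conventions, e.g. whether the discrepancy is a maximum over unions or a sum over rectangles):
\[
\|\mathbf p - \mathbf q\|_{A_k} \;=\; \sup_{R_1, \dots, R_k} \bigl|\mathbf p(S) - \mathbf q(S)\bigr|, \qquad S := R_1 \cup \cdots \cup R_k,
\]
where the supremum ranges over all collections of $k$ pairwise disjoint, axis-aligned rectangles $R_1, \dots, R_k \subseteq \mathbb R^d$. Under this formalization, the $d_{\mathrm{TV}}$ corollary for $k$-histograms over a common partition into rectangles is immediate: the set on which one density exceeds the other is a union of at most $k$ of the partition's rectangles, so $\|\mathbf p - \mathbf q\|_{A_k} = d_{\mathrm{TV}}(\mathbf p, \mathbf q)$ for this class, and any $A_k$ tester is automatically a $d_{\mathrm{TV}}$-closeness tester.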
