
Sample Complexity of Probability Divergences under Group Symmetry

International Conference on Machine Learning (ICML), 2023
Abstract

We rigorously quantify the improvement in the sample complexity of variational divergence estimation for group-invariant distributions. In the cases of the Wasserstein-1 metric and the Lipschitz-regularized α-divergences, the reduction in sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement in sample complexity is more nuanced, as it depends not only on the group size but also on the choice of kernel. Numerical simulations verify our theories.
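To make the MMD case concrete, here is a minimal numerical sketch (not the authors' code): for a distribution invariant under a finite group G, averaging the empirical measure over group orbits yields an MMD estimate whose error shrinks as if the sample size were enlarged by a factor related to |G|. The kernel (Gaussian RBF), group (G = {I, -I} acting by x ↦ -x), and distributions (P = Q = N(0, I), so the true MMD is 0) are illustrative assumptions, not choices taken from the paper.

```python
# Illustrative experiment (assumptions as stated above): estimation error of
# the unbiased MMD^2 estimator with and without group-orbit averaging.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian RBF kernel values between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # U-statistic (unbiased) estimate of squared MMD between two samples.
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    np.fill_diagonal(Kxx, 0.0)
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (m * (m - 1))
            + Kyy.sum() / (n * (n - 1))
            - 2.0 * rbf_kernel(X, Y, sigma).mean())

def symmetrize(X):
    # Average the empirical measure over the G-orbit: for G = {I, -I},
    # append the reflected copy of every sample.
    return np.concatenate([X, -X], axis=0)

n, d, trials = 100, 2, 200
plain, sym = [], []
for _ in range(trials):
    X = rng.standard_normal((n, d))  # sample from P = N(0, I)
    Y = rng.standard_normal((n, d))  # sample from Q = P
    plain.append(mmd2_unbiased(X, Y))
    sym.append(mmd2_unbiased(symmetrize(X), symmetrize(Y)))

# The true MMD^2 is 0, so the RMSE of the estimates measures sample efficiency.
print(f"RMSE, plain estimator:       {np.sqrt(np.mean(np.square(plain))):.3e}")
print(f"RMSE, symmetrized estimator: {np.sqrt(np.mean(np.square(sym))):.3e}")
```

In this toy setting the symmetrized estimate concentrates noticeably more tightly around the true value of zero, consistent with the kernel-dependent improvement the abstract describes for MMD.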
