Sharper Bounds for Chebyshev Moment Matching, with Applications
We study the problem of approximately recovering a probability distribution given noisy measurements of its Chebyshev polynomial moments. This problem arises broadly across algorithms, statistics, and machine learning. By leveraging a global decay bound on the coefficients in the Chebyshev expansion of any Lipschitz function, we sharpen prior work, proving that accurate recovery in the Wasserstein distance is possible with more noise than previously known. Our result immediately yields a number of applications:

1) We give a simple "linear query" algorithm for constructing a differentially private synthetic data distribution with low Wasserstein error based on a dataset of points. This bound is optimal up to log factors and matches a recent result of Boedihardjo, Strohmer, and Vershynin [Probab. Theory Relat. Fields, 2024], which uses a more complex "superregular random walk" method.

2) We give a fast algorithm for the linear algebraic problem of estimating the spectral density of a symmetric matrix up to small error in the Wasserstein distance. Our result accelerates prior methods of Chen et al. [ICML 2021] and Braverman et al. [STOC 2022].

3) We tighten an analysis of Vinayak, Kong, Valiant, and Kakade [ICML 2019] on the maximum likelihood estimator for the statistical problem of "Learning Populations of Parameters", extending the parameter regime in which sample-optimal results can be obtained.

Beyond these main results, we provide an extension of our bound to estimating distributions in higher dimensions. We hope that these bounds will find applications more broadly to problems involving distribution recovery from noisy moment information.
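To make the setting concrete, here is a minimal, illustrative sketch of the kind of noisy Chebyshev moment access that arises in spectral density estimation: the moments tr(T_k(A))/n of a symmetric matrix's eigenvalue distribution are estimated stochastically via Hutchinson's trace estimator and the Chebyshev three-term recurrence. All parameter choices (matrix size, number of moments, number of probe vectors) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative sketch: noisy estimates of the Chebyshev moments
# tr(T_k(A)) / n of a symmetric matrix's spectral density, via
# Hutchinson's stochastic trace estimator. Sizes are arbitrary choices.
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
A /= np.linalg.norm(A, 2)  # scale so all eigenvalues lie in [-1, 1]

num_moments = 10
num_probes = 50

# Exact moments from the eigenvalues, using T_k(x) = cos(k * arccos(x)).
eigs = np.clip(np.linalg.eigvalsh(A), -1.0, 1.0)
exact = np.array([np.mean(np.cos(k * np.arccos(eigs)))
                  for k in range(num_moments)])

# Hutchinson estimate: tr(T_k(A)) is approximated by the average of
# g^T T_k(A) g over random sign vectors g, where T_k(A) g is computed
# with the recurrence T_{k+1}(A) g = 2 A T_k(A) g - T_{k-1}(A) g.
est = np.zeros(num_moments)
for _ in range(num_probes):
    g = rng.choice([-1.0, 1.0], size=n)
    t_prev, t_curr = g, A @ g          # T_0(A) g and T_1(A) g
    est[0] += g @ t_prev
    est[1] += g @ t_curr
    for k in range(2, num_moments):
        t_prev, t_curr = t_curr, 2 * (A @ t_curr) - t_prev
        est[k] += g @ t_curr
est /= num_probes * n

print("max moment error:", np.max(np.abs(est - exact)))
```

Recovering a distribution that matches such noisy moment estimates, and bounding its Wasserstein distance to the true spectral density, is the moment-matching problem the abstract refers to.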