Optimal Sublinear Sampling of Spanning Trees and Determinantal Point Processes via Average-Case Entropic Independence

We design fast algorithms for repeatedly sampling from strongly Rayleigh distributions, which include random spanning tree distributions and determinantal point processes. For a graph $G = (V, E)$, we show how to approximately sample uniformly random spanning trees from $G$ in $\widetilde{O}(\lvert V\rvert)$ time per sample after an initial $\widetilde{O}(\lvert E\rvert)$ time preprocessing. For a determinantal point process on subsets of size $k$ of a ground set of $n$ elements, we show how to approximately sample in $\widetilde{O}(k^{\omega})$ time after an initial $\widetilde{O}(nk^{\omega-1})$ time preprocessing, where $\omega$ is the matrix multiplication exponent. We even improve the state of the art for obtaining a single sample from a determinantal point process, from the prior runtime of $\widetilde{O}(\min\{nk^2, n^{\omega}\})$ to $\widetilde{O}(nk^{\omega-1})$.

In our main technical result, we achieve the optimal limit on domain sparsification for strongly Rayleigh distributions. In domain sparsification, sampling from a distribution $\mu$ on $\binom{[n]}{k}$ is reduced to sampling from related distributions on $\binom{[t]}{k}$ for $t \ll n$. We show that for strongly Rayleigh distributions, we can achieve the optimal $t = \widetilde{O}(k)$. Our reduction involves sampling from $\widetilde{O}(1)$ domain-sparsified distributions, all of which can be produced efficiently assuming convenient access to approximate overestimates for the marginals of $\mu$. Having access to marginals is analogous to having access to the mean and covariance of a continuous distribution, or knowing "isotropy" for the distribution, the key assumption behind the Kannan-Lovász-Simonovits (KLS) conjecture and optimal samplers based on it. We view our result as a moral analog of the KLS conjecture and its consequences for sampling, for discrete strongly Rayleigh measures.
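To make the domain-sparsification idea concrete, here is a minimal brute-force Python sketch, not the paper's algorithm: it enumerates a toy $k$-DPP to obtain exact marginals (standing in for the "approximate overestimates" above), draws a sparsified domain of $t \ll n$ elements with probabilities proportional to those marginals, and then samples from the $k$-DPP restricted to that domain. The helper names (`kdpp_weights`, `kdpp_marginals`, `sample_kdpp`) and the single-shot, rejection-free restriction are simplifying assumptions; the actual reduction reweights the sparsified distributions and repeats $\widetilde{O}(1)$ times.

```python
# Toy illustration of domain sparsification for a k-DPP (brute force, tiny n).
# This is a hedged sketch, not the paper's reduction: it omits the reweighting
# and repetition that make the sparsified samples faithful to the original mu.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def kdpp_weights(L, k):
    """All size-k subsets S and their probabilities, proportional to det(L[S, S])."""
    n = L.shape[0]
    subsets = list(itertools.combinations(range(n), k))
    w = np.array([np.linalg.det(L[np.ix_(S, S)]) for S in subsets])
    return subsets, w / w.sum()

def kdpp_marginals(L, k):
    """Exact marginals P(i in S) by enumeration (feasible only for tiny n)."""
    subsets, p = kdpp_weights(L, k)
    marg = np.zeros(L.shape[0])
    for S, prob in zip(subsets, p):
        marg[list(S)] += prob
    return marg

def sample_kdpp(L, k):
    """Draw one size-k subset from the k-DPP with kernel L."""
    subsets, p = kdpp_weights(L, k)
    return subsets[rng.choice(len(subsets), p=p)]

# Ground set of n = 10 elements, target size k = 3, sparsified domain size t = 6.
n, k, t = 10, 3, 6
A = rng.standard_normal((n, n))
L = A @ A.T + np.eye(n)  # positive definite L-ensemble kernel

# "Preprocessing": marginal overestimates (here computed exactly).
marg = kdpp_marginals(L, k)

# Domain sparsification: pick t << n candidates, favoring large marginals,
# then sample from the k-DPP restricted to those t elements.
domain = rng.choice(n, size=t, replace=False, p=marg / marg.sum())
S_local = sample_kdpp(L[np.ix_(domain, domain)], k)
S = sorted(int(domain[i]) for i in S_local)
print("sparsified domain:", sorted(domain.tolist()), "-> sample:", S)
```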