arXiv:2312.17701

Density estimation using the perceptron

29 December 2023
P. R. Gerber
Tianze Jiang
Yury Polyanskiy
Rui Sun
Abstract

We propose a new density estimation algorithm. Given $n$ i.i.d. samples from a distribution belonging to a class of densities on $\mathbb{R}^d$, our estimator outputs any density in the class whose "perceptron discrepancy" with the empirical distribution is at most $O(\sqrt{d/n})$. The perceptron discrepancy between two distributions is defined as the largest difference in mass that they place on any halfspace of $\mathbb{R}^d$. We show that this estimator achieves an expected total variation distance to the truth that is almost minimax optimal over the class of densities with bounded Sobolev norm and over Gaussian mixtures. This suggests that regularity of the prior distribution could explain the effectiveness of the ubiquitous step in machine learning that replaces optimization over large function spaces with simpler parametric classes (e.g., the discriminators of GANs). We generalize the above result to show that replacing the perceptron discrepancy with the generalized energy distance of Székely and Rizzo further improves the total variation loss. The generalized energy distance between empirical distributions is easily computable and differentiable, making it especially useful for fitting generative models. To the best of our knowledge, it is the first example of a distance with such properties for which minimax statistical guarantees are available.
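Both discrepancies in the abstract can be approximated directly from samples. The sketch below is illustrative only, not code from the paper: the function names and the random-direction search are our own assumptions. It lower-bounds the perceptron discrepancy between two empirical samples by scanning halfspaces along random unit directions (a one-dimensional Kolmogorov-Smirnov statistic per direction), and it computes the empirical energy distance $2\,\mathbb{E}\|X-Y\| - \mathbb{E}\|X-X'\| - \mathbb{E}\|Y-Y'\|$.

import numpy as np


def perceptron_discrepancy(x, y, n_dirs=1000, rng=None):
    """Monte Carlo lower bound on sup over halfspaces H of
    |P_x(H) - P_y(H)| between empirical samples x (n, d) and y (m, d).
    For each random unit direction w, the sup over thresholds t of the
    mass difference on {z : <w, z> <= t} is the two-sample KS statistic
    of the projected samples."""
    rng = np.random.default_rng(rng)
    n, d = x.shape
    m = y.shape[0]
    best = 0.0
    for _ in range(n_dirs):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        px, py = np.sort(x @ w), np.sort(y @ w)
        # Both empirical CDFs are right-continuous step functions, so
        # their largest gap is attained at one of the pooled points.
        grid = np.concatenate([px, py])
        fx = np.searchsorted(px, grid, side="right") / n
        fy = np.searchsorted(py, grid, side="right") / m
        best = max(best, np.abs(fx - fy).max())
    return best


def energy_distance(x, y):
    """Plug-in (V-statistic) estimate of the Szekely-Rizzo energy
    distance 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||; within-sample means
    include the zero diagonal, so the estimate is slightly biased."""
    def mean_pdist(a, b):
        return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).mean()
    return 2 * mean_pdist(x, y) - mean_pdist(x, x) - mean_pdist(y, y)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((500, 2))
    y = rng.standard_normal((500, 2)) + 0.5  # mean-shifted Gaussian
    print("perceptron discrepancy ~", perceptron_discrepancy(x, y, rng=1))
    print("energy distance        ~", energy_distance(x, y))

Note that the random-direction search only lower-bounds the true supremum over halfspaces; the paper's estimator would minimize such a discrepancy over a density class rather than merely evaluate it.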
