

SuperMix: Sparse Regularization for Mixtures

23 July 2019
Yohann De Castro
S. Gadat
C. Marteau
Cathy Maugis
arXiv:1907.10592
Abstract

This paper investigates the statistical estimation of a discrete mixing measure $\mu_0$ involved in a kernel mixture model. Using recent advances in $\ell_1$-regularization over the space of measures, we introduce a "data fitting and regularization" convex program for estimating $\mu_0$ in a grid-less manner from a sample of the mixture law; this method is referred to as Beurling-LASSO. Our contribution is twofold: we derive a lower bound on the bandwidth of our data-fitting term, depending only on the support of $\mu_0$ and its so-called "minimum separation", that ensures quantitative support localization error bounds; and, under a so-called "non-degenerate source condition", we derive a non-asymptotic support stability property. The latter shows that, for a sufficiently large sample size $n$, our estimator has exactly as many weighted Dirac masses as the target $\mu_0$, converging in amplitude and localization towards the true ones. Finally, we also introduce tractable algorithms for solving this convex program based on "Sliding Frank-Wolfe" or "Conic Particle Gradient Descent". The statistical performance of this estimator is investigated by designing a so-called "dual certificate" appropriate to our setting. Some classical situations, such as mixtures of super-smooth distributions (e.g. Gaussian distributions) or ordinary-smooth distributions (e.g. Laplace distributions), are discussed at the end of the paper.
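To make the abstract concrete, here is a minimal sketch, in Python, of a Sliding-Frank-Wolfe-style loop for a Beurling-LASSO-type objective. This is not the authors' implementation: it assumes a 1-D Gaussian kernel mixture, a known bandwidth h, a kernel-smoothed data-fitting term on a grid, a grid-seeded atom-insertion step, a non-negative least-squares weight refit, and a generic local "sliding" refinement via Nelder-Mead. All names and parameter values below are hypothetical choices for illustration.

```python
# A minimal, illustrative Sliding-Frank-Wolfe-style loop for a
# Beurling-LASSO-type objective (NOT the paper's implementation).
import numpy as np
from scipy.optimize import minimize, nnls

rng = np.random.default_rng(0)

# Synthetic sample from a two-component 1-D Gaussian mixture.
n = 2000
z = rng.random(n) < 0.4
x = np.where(z, rng.normal(-1.0, 0.3, n), rng.normal(1.5, 0.3, n))

h = 0.3      # kernel bandwidth (assumed known here)
lam = 0.05   # l1 penalty level (hypothetical choice)

def phi(t, u):
    # Gaussian kernel phi_h(u - t), used both as the mixture
    # component and as the smoothing filter in the data-fitting term.
    return np.exp(-0.5 * ((u - t) / h) ** 2) / (h * np.sqrt(2 * np.pi))

# Kernel-smoothed empirical density f_hat on an evaluation grid.
grid = np.linspace(x.min() - 1, x.max() + 1, 400)
f_hat = phi(x[None, :], grid[:, None]).mean(axis=1)

def model(a, t):
    # Density of the candidate discrete measure sum_k a_k * delta_{t_k}
    # pushed through the kernel: sum_k a_k * phi_h(. - t_k).
    return phi(np.asarray(t)[None, :], grid[:, None]) @ np.asarray(a)

atoms_a, atoms_t = [], []
for _ in range(5):                         # a few greedy atom insertions
    resid = f_hat - (model(atoms_a, atoms_t) if atoms_t else 0.0)
    # Correlate the residual with the kernel over candidate locations
    # (a crude, grid-based stand-in for maximizing the dual certificate).
    corr = phi(grid[None, :], grid[:, None]) @ resid
    if corr.max() <= lam:                  # certificate-style stopping rule
        break
    atoms_t.append(grid[np.argmax(corr)])  # new Dirac location

    # Refit non-negative weights by least squares.
    Phi = phi(np.asarray(atoms_t)[None, :], grid[:, None])
    atoms_a, _ = nnls(Phi, f_hat)
    atoms_a = list(atoms_a)

    # "Sliding" step: jointly refine all weights and locations locally.
    k = len(atoms_t)
    def obj(p):
        fit = np.sum((model(p[:k], p[k:]) - f_hat) ** 2)
        return fit + lam * np.sum(np.abs(p[:k]))
    res = minimize(obj, np.concatenate([atoms_a, atoms_t]),
                   method="Nelder-Mead")
    atoms_a, atoms_t = list(res.x[:k]), list(res.x[k:])

print("estimated locations:", np.round(atoms_t, 3))
print("estimated weights:  ", np.round(atoms_a, 3))
```

The sketch mirrors the structure described in the abstract: a greedy insertion of weighted Dirac masses driven by the residual, followed by a local "sliding" refinement of amplitudes and localizations. The paper's actual algorithms (Sliding Frank-Wolfe, Conic Particle Gradient Descent) and its dual-certificate analysis are considerably more refined than this toy loop.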
