arXiv:2012.00444
Minimax bounds for estimating multivariate Gaussian location mixtures

1 December 2020
Arlene K. H. Kim
Adityanand Guntuboyina
Abstract

We prove minimax bounds for estimating Gaussian location mixtures on $\mathbb{R}^d$ under the squared $L^2$ and the squared Hellinger loss functions. Under the squared $L^2$ loss, we prove that the minimax rate is upper and lower bounded by a constant multiple of $n^{-1}(\log n)^{d/2}$. Under the squared Hellinger loss, we consider two subclasses based on the behavior of the tails of the mixing measure. When the mixing measure has a sub-Gaussian tail, the minimax rate under the squared Hellinger loss is bounded from below by $(\log n)^{d}/n$. On the other hand, when the mixing measure is only assumed to have a bounded $p^{\text{th}}$ moment for a fixed $p > 0$, the minimax rate under the squared Hellinger loss is bounded from below by $n^{-p/(p+d)}(\log n)^{-3d/2}$. These rates are minimax optimal up to logarithmic factors.
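For quick reference, the three rates stated in the abstract can be collected in display form. The symbol $\mathcal{R}$ for the minimax risk is notation introduced here for readability, not taken from the abstract:

```latex
% Squared L^2 loss: rate matched from above and below up to constants
\mathcal{R}_{L^2}(n) \;\asymp\; n^{-1}(\log n)^{d/2}

% Squared Hellinger loss, sub-Gaussian mixing measure: lower bound
\mathcal{R}_{H^2}(n) \;\gtrsim\; \frac{(\log n)^{d}}{n}

% Squared Hellinger loss, mixing measure with bounded p-th moment,
% fixed p > 0: lower bound
\mathcal{R}_{H^2}(n) \;\gtrsim\; n^{-p/(p+d)}(\log n)^{-3d/2}
```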
