Learning Balanced Mixtures of Discrete Distributions with Small Sample

10 February 2008 · arXiv:0802.1244
Shuheng Zhou
Abstract

We study the problem of partitioning a small sample of $n$ individuals, drawn from a mixture of $k$ product distributions over the Boolean cube $\{0,1\}^K$, according to their distributions. Each distribution is described by a vector of allele frequencies in $\mathbb{R}^K$, and bits are assumed to be independently distributed across the $K$ dimensions. Given two distributions, we use $\gamma$ to denote the average $\ell_2^2$ distance between their frequencies across the $K$ dimensions, which measures the statistical divergence between them. This work demonstrates that, for a balanced input instance with $k = 2$, a graph-based optimization function returns the correct partition with high probability: a weighted graph $G$ is formed over the $n$ individuals, with edge weights given by the pairwise Hamming distances between their bit vectors, and the function computes a maximum-weight balanced cut of $G$, where the weight of a cut is the sum of the weights of all edges crossing it. The guarantee holds as long as $K = \Omega(\ln n / \gamma)$ and $Kn = \tilde{\Omega}(\ln n / \gamma^2)$. This result demonstrates a useful property of the high-dimensional feature space: one can trade off the number of features against the sample size to accomplish tasks such as clustering.
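
To make the cut objective concrete, here is a minimal Python sketch (not the paper's implementation; the function names `hamming_weights` and `max_weight_balanced_cut` are illustrative) that builds the weighted graph over $n$ bit vectors and finds the maximum-weight balanced cut by exhaustive search, which is only feasible for small $n$; the paper's contribution is the statistical guarantee for this objective, not a search procedure.

```python
# Sketch of the balanced max-cut objective from the abstract, under the
# assumption of even n. Individuals are rows of an n x K 0/1 matrix X;
# edge weights are pairwise Hamming distances between bit vectors.
from itertools import combinations
import numpy as np

def hamming_weights(X: np.ndarray) -> np.ndarray:
    """n x n matrix of pairwise Hamming distances between the rows of X."""
    return (X[:, None, :] != X[None, :, :]).sum(axis=2)

def max_weight_balanced_cut(X: np.ndarray):
    """Brute-force the max-weight balanced cut; returns (one side, weight)."""
    n = X.shape[0]
    W = hamming_weights(X)
    best_side, best_weight = None, -1
    # Fix individual 0 on one side so each cut is enumerated only once.
    for rest in combinations(range(1, n), n // 2 - 1):
        S = (0,) + rest
        T = [i for i in range(n) if i not in S]
        weight = W[np.ix_(S, T)].sum()  # sum of edge weights across the cut
        if weight > best_weight:
            best_side, best_weight = set(S), weight
    return best_side, best_weight

# Toy usage: two product distributions over {0,1}^K with frequency
# vectors p and q, so gamma = mean((p - q)^2) = 0.16, and K = 200
# comfortably exceeds ln(n)/gamma for n = 10.
rng = np.random.default_rng(0)
K, half = 200, 5
p, q = np.full(K, 0.3), np.full(K, 0.7)
X = np.vstack([rng.random((half, K)) < p,
               rng.random((half, K)) < q]).astype(int)
side, w = max_weight_balanced_cut(X)
print(sorted(side), w)  # expect {0,...,4} (or its complement) recovered
```

The cut that separates the two mixture components has the largest expected weight because the expected per-bit disagreement across components, $p(1-q) + q(1-p)$, exceeds the within-component disagreement $2p(1-p)$ whenever the frequency vectors differ.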
