
Learning Balanced Mixtures of Discrete Distributions with Small Sample

Abstract

We study the problem of partitioning a small sample of $n$ individuals, drawn from a mixture of $k$ product distributions over the Boolean cube $\{0, 1\}^K$, according to their distributions. Each distribution is described by a vector of allele frequencies in $\mathbb{R}^K$. Given two distributions, we use $\gamma$ to denote the average $\ell_2^2$ distance between their frequencies across the $K$ dimensions, which measures the statistical divergence between them. We assume that bits are independently distributed across the $K$ dimensions. For a balanced input instance with $k = 2$, we show that a certain graph-based optimization returns the correct partition with high probability: form a weighted graph $G$ over the $n$ individuals, whose edge weights are the pairwise Hamming distances between the corresponding bit vectors, and compute a maximum-weight balanced cut of $G$, where the weight of a cut is the sum of the weights of the edges crossing it. This succeeds so long as $K = \Omega(\ln n/\gamma)$ and $Kn = \tilde\Omega(\ln n/\gamma^2)$. The result demonstrates a nice property of high-dimensional feature spaces: one can trade off the number of features required against the size of the sample to accomplish tasks such as clustering.
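As an illustrative sketch (not the paper's algorithm or its analysis), the objective above can be made concrete for tiny inputs: build the complete graph whose edge weights are pairwise Hamming distances, then brute-force the maximum-weight balanced cut. The function names and the toy data below are hypothetical; brute force is exponential in $n$ and is used only to demonstrate the objective itself.

```python
from itertools import combinations

def hamming(u, v):
    """Hamming distance between two equal-length bit vectors."""
    return sum(a != b for a, b in zip(u, v))

def max_weight_balanced_cut(samples):
    """Brute-force the maximum-weight balanced cut of the complete
    graph over the samples, using pairwise Hamming distances as edge
    weights. Returns (best_weight, best_side), where best_side is the
    index set of one half of the partition. Exponential in n; for
    illustration on tiny inputs only."""
    n = len(samples)
    assert n % 2 == 0, "a balanced cut needs an even number of samples"
    # Precompute all pairwise edge weights.
    w = {(i, j): hamming(samples[i], samples[j])
         for i, j in combinations(range(n), 2)}
    best_weight, best_side = -1, None
    # Fix sample 0 on one side to avoid enumerating each cut twice.
    for rest in combinations(range(1, n), n // 2 - 1):
        side = {0, *rest}
        # Cut weight = sum of weights of edges with endpoints on opposite sides.
        weight = sum(wt for (i, j), wt in w.items()
                     if (i in side) != (j in side))
        if weight > best_weight:
            best_weight, best_side = weight, side
    return best_weight, best_side

# Tiny synthetic example: two groups with visibly different bit patterns.
samples = [
    (0, 0, 0, 0, 0, 1),   # group A
    (0, 0, 0, 1, 0, 0),   # group A
    (1, 1, 1, 0, 1, 1),   # group B
    (1, 1, 1, 1, 1, 0),   # group B
]
weight, side = max_weight_balanced_cut(samples)
print(sorted(side))  # one side of the recovered partition → [0, 1]
```

On this toy instance the heaviest balanced cut separates group A from group B, matching the intuition that between-group Hamming distances exceed within-group ones when the distributions are sufficiently divergent.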
