The Sparse Hausdorff Moment Problem, with Application to Topic Models

16 July 2020
Spencer Gordon
Bijan Mazaheri
Leonard J. Schulman
Y. Rabani
arXiv: 2007.08101
Abstract

We consider the problem of identifying, from its first $m$ noisy moments, a probability distribution on $[0,1]$ of support $k<\infty$. This is equivalent to the problem of learning a distribution on $m$ observable binary random variables $X_1, X_2, \dots, X_m$ that are iid conditional on a hidden random variable $U$ taking values in $\{1,2,\dots,k\}$. Our focus is on accomplishing this with $m=2k$, which is the minimum $m$ for which verifying that the source is a $k$-mixture is possible (even with exact statistics). This problem, so simply stated, is quite useful: e.g., by a known reduction, any algorithm for it lifts to an algorithm for learning pure topic models. In past work on this and also the more general mixture-of-products problem ($X_i$ independent conditional on $U$, but not necessarily iid), a barrier at $m^{O(k^2)}$ on the sample complexity and/or runtime of the algorithm was reached. We improve this substantially. We show it suffices to use a sample of size $\exp(k\log k)$ (with $m=2k$). It is known that the sample complexity of any solution to the identification problem must be $\exp(\Omega(k))$. Stated in terms of the moment problem, it suffices to know the moments to additive accuracy $\exp(-k\log k)$. Our run-time for the moment problem is only $O(k^{2+o(1)})$ arithmetic operations.
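
To make the setup concrete, here is a minimal sketch of the exact-moment (noiseless) version of the problem: a $k$-atom distribution on $[0,1]$ is recovered from its first $2k$ moments using the classical Prony / matrix-pencil method. This is not the paper's algorithm, whose contribution concerns the noisy setting and its sample and accuracy requirements; the atoms, weights, and function names below are illustrative assumptions.

```python
# Sketch, assuming exact moments: recover a k-atom distribution on [0,1]
# via the classical Prony / matrix-pencil method (not the paper's method).
import numpy as np
from scipy.linalg import eig, hankel

def moments(atoms, weights, count):
    """Return m_0, ..., m_{count-1} with m_j = sum_i w_i * x_i**j."""
    return np.array([np.dot(weights, atoms ** j) for j in range(count)])

def recover(m, k):
    """Recover k atoms and weights from exact moments m_0, ..., m_{2k-1}."""
    # Hankel matrices H0[i,j] = m_{i+j} and H1[i,j] = m_{i+j+1}, i, j = 0..k-1.
    H0 = hankel(m[:k], m[k - 1:2 * k - 1])
    H1 = hankel(m[1:k + 1], m[k:2 * k])
    # Generalized eigenvalues of (H1, H0) are the atom locations.
    vals, _ = eig(H1, H0)
    atoms = np.sort(np.real(vals))
    # Weights solve the Vandermonde system: sum_i w_i x_i^j = m_j, j = 0..k-1.
    V = np.vander(atoms, k, increasing=True).T
    weights = np.linalg.solve(V, m[:k])
    return atoms, weights

# Illustrative example: a 3-spike distribution on [0, 1].
atoms_true = np.array([0.2, 0.5, 0.9])
weights_true = np.array([0.3, 0.5, 0.2])
m = moments(atoms_true, weights_true, 2 * 3)
print(recover(m, 3))  # approximately recovers atoms_true and weights_true
```

With exact moments and distinct atoms this recovery is stable in exact arithmetic; the difficulty addressed by the paper is how much additive moment error, of order $\exp(-k\log k)$, such an identification can tolerate, and at what computational cost.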
