Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions

26 February 2020
Yi Hao, A. Orlitsky
ArXiv (abs) · PDF · HTML

Papers citing "Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions"

4 papers shown
Instance Based Approximations to Profile Maximum Likelihood
Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford
05 Nov 2020
On the High Accuracy Limitation of Adaptive Property Estimation
International Conference on Artificial Intelligence and Statistics (AISTATS), 2020
Yanjun Han
27 Aug 2020
On the Competitive Analysis and High Accuracy Optimality of Profile Maximum Likelihood
Yanjun Han, Kirankumar Shiragur
07 Apr 2020
The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood
Annual Conference on Computational Learning Theory (COLT), 2020
Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford
06 Apr 2020