ResearchTrend.AI

Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions

Yi Hao, A. Orlitsky
arXiv:2002.11665 · 26 February 2020

Papers citing "Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions"

2 / 2 papers shown
Instance Based Approximations to Profile Maximum Likelihood
Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford
05 Nov 2020
The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood
Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford
06 Apr 2020