Do PAC-Learners Learn the Marginal Distribution?

13 February 2023
Max Hopkins
Daniel M. Kane
Shachar Lovett
Gaurav Mahajan
Abstract

The Fundamental Theorem of PAC Learning asserts that learnability of a concept class H is equivalent to the uniform convergence of empirical error in H to its mean, or equivalently, to the problem of density estimation: learnability of the underlying marginal distribution with respect to events in H. This seminal equivalence relies strongly on PAC learning's `distribution-free' assumption, that the adversary may choose any marginal distribution over data. Unfortunately, the distribution-free model is known to be overly adversarial in practice, failing to predict the success of modern machine learning algorithms; yet without the Fundamental Theorem, our theoretical understanding of learning under distributional constraints remains highly limited.

In this work, we revisit the connection between PAC learning, uniform convergence, and density estimation beyond the distribution-free setting, when the adversary is restricted to choosing a marginal distribution from a known family 𝒫. We prove that while the traditional Fundamental Theorem indeed fails, a finer-grained connection between the three fundamental notions continues to hold:

  1. PAC learning is strictly sandwiched between two refined models of density estimation, differing only in whether the learner knows the set of well-estimated events in H.
  2. Under reasonable assumptions on H and 𝒫, density estimation is equivalent to uniform estimation, a relaxation of uniform convergence allowing non-empirical estimators.

Together, our results give a clearer picture of how the Fundamental Theorem extends beyond the distribution-free setting and shed new light on the classically challenging problem of learning under arbitrary distributional assumptions.
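For reference, the two notions the abstract contrasts can be stated formally. These are the standard textbook definitions, not the paper's refined variants:

```latex
% Standard definitions (textbook forms, not taken from the paper itself).
% A class H is PAC-learnable if there exist an algorithm A and a sample
% complexity n(\varepsilon, \delta) such that for every distribution D over
% examples and every target h^* \in H, with probability at least 1 - \delta
% over an i.i.d. sample S \sim D^n,
\[
  \Pr_{x \sim D}\bigl[\,A(S)(x) \neq h^*(x)\,\bigr] \;\le\; \varepsilon.
\]
% H satisfies uniform convergence if empirical errors concentrate around
% their means simultaneously over the entire class:
\[
  \Pr_{S \sim D^n}\Bigl[\,\sup_{h \in H}\,
    \bigl|\mathrm{err}_S(h) - \mathrm{err}_D(h)\bigr| > \varepsilon\,\Bigr]
  \;\le\; \delta.
\]
```

The Fundamental Theorem (in the distribution-free setting) makes these two properties, and finiteness of the VC dimension of H, equivalent; the paper's results concern what survives of this equivalence once D is constrained to a family 𝒫.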

@article{hopkins2025_2302.06285,
  title={Do PAC-Learners Learn the Marginal Distribution?},
  author={Max Hopkins and Daniel M. Kane and Shachar Lovett and Gaurav Mahajan},
  journal={arXiv preprint arXiv:2302.06285},
  year={2025}
}