Concentration of discrepancy-based approximate Bayesian computation via Rademacher complexity

14 June 2022
Sirio Legramanti
Daniele Durante
Pierre Alquier
Abstract

There has been an increasing interest in summary-free versions of approximate Bayesian computation (ABC), which replace distances among summaries with discrepancies between the empirical distributions of the observed data and the synthetic samples generated under the proposed parameter values. The success of these solutions has motivated theoretical studies on the limiting properties of the induced posteriors. However, current results (i) are often tailored to a specific discrepancy, (ii) require, either explicitly or implicitly, regularity conditions on the data-generating process and the assumed statistical model, and (iii) yield bounds depending on sequences of control functions that are not made explicit. As such, there is a lack of a theoretical framework that (i) is unified, (ii) facilitates the derivation of limiting properties that hold uniformly, and (iii) relies on verifiable assumptions that provide concentration bounds clarifying which factors govern the limiting behavior of the ABC posterior. We address this gap via a novel theoretical framework that introduces the concept of Rademacher complexity in the analysis of the limiting properties of discrepancy-based ABC posteriors. This yields a unified theory that relies on constructive arguments and provides more informative asymptotic results and uniform concentration bounds, even in settings not covered by current studies. These advancements are obtained by relating the properties of summary-free ABC posteriors to the behavior of the Rademacher complexity associated with the chosen discrepancy within the family of integral probability semimetrics. This family extends summary-based ABC and includes, among others, the Wasserstein distance and the maximum mean discrepancy (MMD). As clarified through a focus on the MMD case and via illustrative simulations, this perspective yields an improved understanding of summary-free ABC.
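To make the summary-free setting concrete, the following is a minimal sketch of rejection ABC with an MMD discrepancy, one member of the integral probability semimetric family mentioned in the abstract. It is an illustration, not the authors' construction: the function names (mmd2, abc_mmd), the Gaussian kernel and its bandwidth, the tolerance, and the toy Gaussian location model are all illustrative assumptions.

```python
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Biased estimate of the squared maximum mean discrepancy between
    1-D samples x and y under a Gaussian kernel (assumed bandwidth)."""
    def gram(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2 * gram(x, y).mean()

def abc_mmd(observed, prior_sampler, simulator, n_draws=2000, tol=0.15):
    """Rejection ABC: keep parameter draws whose synthetic sample lies
    within MMD tolerance `tol` of the observed data (no summaries used)."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        synthetic = simulator(theta, len(observed))
        if mmd2(observed, synthetic) < tol ** 2:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: infer the mean of a Gaussian location model with known variance.
rng = np.random.default_rng(0)
observed = rng.normal(loc=1.5, scale=1.0, size=200)
draws = abc_mmd(
    observed,
    prior_sampler=lambda: rng.normal(0.0, 3.0),            # N(0, 3^2) prior on the mean
    simulator=lambda theta, n: rng.normal(theta, 1.0, n),  # generates synthetic data
)
print(f"accepted {draws.size} draws; approximate posterior mean {draws.mean():.2f}")
```

The tolerance is deliberately loose so that a nontrivial fraction of prior draws is accepted; the paper's concentration results concern how such an ABC posterior behaves as the tolerance shrinks and the sample size grows, with the Rademacher complexity of the chosen discrepancy governing the rates.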
