ResearchTrend.AI



Error-guided likelihood-free MCMC

13 October 2020
Volodimir Begy
Erich Schikuta
Abstract

This work presents a novel posterior inference method for models with intractable evidence and likelihood functions. Error-guided likelihood-free MCMC (EG-LF-MCMC) has been developed for scientific applications in which a researcher seeks approximate posterior densities over model parameters while avoiding both the expensive training of component estimators on full observational data and the tedious design of expressive summary statistics required by related approaches. Our technique proceeds in two phases. In the first phase, we draw samples from the prior, simulate the respective observations, and record their errors $\epsilon$ relative to the true observation. We then train a classifier to distinguish corresponding from non-corresponding $(\epsilon, \boldsymbol{\theta})$-tuples. In the second phase, this classifier is conditioned on the smallest $\epsilon$ value recorded in the training set and employed to compute transition probabilities in a Markov chain Monte Carlo sampling procedure. Because the MCMC can be conditioned on specific $\epsilon$ values, our method may also be used in an amortized fashion to infer posterior densities for observations located a given distance away from the observed data. We evaluate the proposed method on benchmark problems with semantically and structurally different data and compare its performance against state-of-the-art approximate Bayesian computation (ABC).
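The two phases described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' implementation: it assumes a toy one-dimensional simulator (a noisy draw around $\theta$), a uniform prior, a logistic-regression classifier with a hand-picked quadratic feature map in place of whatever classifier architecture the paper uses, and a plain Metropolis random walk whose acceptance ratio is driven by the classifier conditioned on the smallest recorded $\epsilon$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumption, not from the paper): the simulator draws a noisy
# observation around theta, standing in for any intractable black-box model.
x_obs = 2.0                                      # the "true" observation

# --- Phase 1: prior draws, simulated errors, classifier training ---
n = 2000
thetas = rng.uniform(-5.0, 5.0, size=n)          # samples from a uniform prior
sims = thetas + rng.normal(0.0, 1.0, size=n)     # simulated observations
eps = np.abs(sims - x_obs)                       # errors w.r.t. the true observation

def features(e, t):
    # Hand-picked feature map so a linear classifier can express a bump in theta.
    return np.column_stack([np.ones_like(t), e, t, t * t])

# Corresponding (epsilon, theta) tuples get label 1; tuples with shuffled
# thetas (hence non-corresponding) get label 0.
X = np.vstack([features(eps, thetas), features(eps, rng.permutation(thetas))])
y = np.concatenate([np.ones(n), np.zeros(n)])

w = np.zeros(X.shape[1])
for _ in range(4000):                            # plain gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.02 * X.T @ (p - y) / len(y)

eps_min = eps.min()                              # smallest recorded error

def surrogate(theta):
    # Classifier probability that (eps_min, theta) is a corresponding tuple,
    # i.e. the classifier conditioned on the smallest epsilon (phase 2).
    z = features(np.array([eps_min]), np.array([theta])) @ w
    return 1.0 / (1.0 + np.exp(-z[0]))

# --- Phase 2: Metropolis sampling driven by the conditioned classifier ---
theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.5)          # symmetric random-walk proposal
    if -5.0 <= prop <= 5.0:                      # stay on the prior's support
        if rng.uniform() < surrogate(prop) / max(surrogate(theta), 1e-12):
            theta = prop
    chain.append(theta)

posterior = np.array(chain[1000:])               # discard burn-in
print(f"posterior mean around {posterior.mean():.2f} (simulator centred at 2.0)")
```

Note the amortization mentioned in the abstract: replacing `eps_min` in `surrogate` with any other $\epsilon$ value would target the posterior for observations that distance away from the data, without retraining the classifier.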
