Fast compression of MCMC output

9 July 2021
Nicolas Chopin
Gabriel Ducrocq
Abstract

We propose cube thinning, a novel method for compressing the output of an MCMC (Markov chain Monte Carlo) algorithm when control variates are available. It amounts to resampling the initial MCMC sample (according to weights derived from the control variates) while imposing equality constraints on averages of these control variates, using the cube method of [1]. Its main advantage is that its CPU cost is linear in N, the original sample size, and constant in M, the required size of the compressed sample. This compares favourably to Stein thinning [2], which has complexity O(NM²) and which requires the gradient of the target log-density (whose availability automatically implies that of control variates). Our numerical experiments suggest that cube thinning is also competitive in terms of statistical error.
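For readers unfamiliar with the cube method of [1] that the abstract refers to, the sketch below illustrates its flight phase, the balanced-sampling step that keeps the control-variate averages fixed while randomly rounding inclusion probabilities to 0 or 1. It is a minimal Python illustration under assumed inputs (uniform inclusion probabilities, a single control variate), not the authors' implementation of cube thinning: the control-variate weighting step and the landing phase described in the paper are omitted, and all function and variable names are our own.

```python
import numpy as np

def cube_flight_phase(pi, A, rng=None):
    """Flight phase of the cube method (balanced sampling).

    Randomly rounds most inclusion probabilities `pi` to 0 or 1 while keeping
    the balancing totals A.T @ pi unchanged, so the selected subset respects
    the equality constraints on the control-variate averages (up to at most p
    undecided units, normally handled by a landing phase, omitted here).

    pi : (N,) inclusion probabilities in (0, 1), summing to the target size M.
    A  : (N, p) balancing variables, e.g. control variates divided by pi.
    """
    rng = np.random.default_rng() if rng is None else rng
    pi = np.asarray(pi, dtype=float).copy()
    N, p = A.shape
    eps = 1e-10
    while True:
        free = np.where((pi > eps) & (pi < 1.0 - eps))[0]
        if free.size <= p:                  # leave the rest to a landing phase
            break
        idx = free[: p + 1]                 # work on p+1 undecided units at a time
        # direction u with A[idx].T @ u = 0, so balancing totals are preserved
        u = np.linalg.svd(A[idx].T)[2][-1]
        # largest steps keeping every probability inside [0, 1]
        up, dn = u > eps, u < -eps
        lam1 = np.min(np.concatenate(
            [(1.0 - pi[idx][up]) / u[up], -pi[idx][dn] / u[dn], [np.inf]]))
        lam2 = np.min(np.concatenate(
            [pi[idx][up] / u[up], (pi[idx][dn] - 1.0) / u[dn], [np.inf]]))
        # martingale update: the expected value of pi is unchanged
        if rng.random() < lam2 / (lam1 + lam2):
            pi[idx] += lam1 * u
        else:
            pi[idx] -= lam2 * u
    return pi


# Toy illustration (not the paper's experiments): N i.i.d. draws standing in
# for MCMC output, one control variate h(x) = x with known expectation zero.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, M = 10_000, 200
    x = rng.normal(size=N)
    pi0 = np.full(N, M / N)                 # uniform inclusion probabilities
    A = x[:, None] / pi0[:, None]           # balancing variable h(x) / pi
    pi = cube_flight_phase(pi0, A, rng)
    keep = pi > 0.5                         # crude rounding of the few leftovers
    print(keep.sum(), x[keep].mean())       # roughly M points, mean close to 0
```

Each pass of the loop decides at least one unit, so the flight phase runs in at most N iterations, which is consistent with the CPU cost linear in N and independent of M claimed in the abstract.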
