ResearchTrend.AI


arXiv:1811.05073

Regularised Zero-Variance Control Variates

13 November 2018
Leah F. South
Chris J. Oates
Antonietta Mira
Abstract

Zero-variance control variates (ZV-CV) is a post-processing method that reduces the variance of Monte Carlo estimators of expectations using the derivatives of the log target. Once the derivatives are available, the only additional computational effort is solving a linear regression problem. Significant variance reductions have been achieved with this method in low-dimensional examples, but the number of covariates in the regression grows rapidly with the dimension of the parameters. We propose to exploit penalised regression to make the method more flexible and feasible, particularly in higher dimensions. A specific type of regularisation based on using subsets of derivatives, or a priori regularisation as we refer to it in this paper, is also proposed to reduce computational and storage requirements. The novel application of ZV-CV and regularised ZV-CV to sequential Monte Carlo (SMC) is described, where a new estimator for the normalising constant of the posterior is provided to aid Bayesian model choice. Several examples showing the utility and potential limitations of regularised ZV-CV for Bayesian inference are given. The methods proposed in this paper are easily accessible through the R package ZVCV, available at https://github.com/LeahPrice/ZVCV.
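As a minimal illustration of the core idea (a sketch in Python, not the ZVCV package's API): with first-order polynomials, the control variate is a linear function of the score ∇log π(θ), and its coefficient is found by ordinary least squares. The toy target below is a standard normal, whose score is -θ, so the integrand f(θ) = θ is fit exactly and the post-processed estimator has (numerically) zero variance; in higher dimensions, the penalised variants proposed in the paper would replace the OLS step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from a standard normal target pi(theta) = N(0, 1)
theta = rng.standard_normal(5000)

# Score function: d/dtheta log pi(theta) = -theta
u = -theta

# Integrand: estimate E[theta] (true value 0)
f = theta

# First-order ZV-CV: regress f on the score (with an intercept),
# then subtract the fitted, zero-mean control variate from f.
X = np.column_stack([np.ones_like(u), u])
beta, *_ = np.linalg.lstsq(X, f, rcond=None)
f_cv = f - u * beta[1]

plain_estimate = f.mean()   # ordinary Monte Carlo estimator
zvcv_estimate = f_cv.mean() # post-processed estimator
```

Here `f_cv` has the same expectation as `f` (the score has mean zero under the target) but far smaller variance; in this linear toy case the fit is exact and the variance vanishes.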
