On the Opportunities and Pitfalls of Nesting Monte Carlo Estimators

Abstract

We present a formalization of nested Monte Carlo (NMC) estimation, whereby terms in an outer estimator are themselves the output of separate, nested Monte Carlo (MC) estimators. We demonstrate that NMC can provide consistent estimates of nested expectations, including cases of repeated nesting, under mild conditions; establish corresponding rates of convergence; and provide empirical evidence that suggests these rates are observed in practice. We further establish a number of pitfalls that can arise from naively nesting MC estimators and provide guidelines for how they can be avoided. Our results show that whenever an outer estimator depends nonlinearly on an inner estimator, the number of samples used in both the inner and outer estimators must, in general, be driven to infinity for convergence. We also lay out novel methods for reformulating certain classes of nested expectation problems into a single expectation, leading to improved convergence rates compared with naive NMC. Finally, we derive a new estimator for use in discrete Bayesian experimental design problems which has a better convergence rate than existing methods.
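To make the setup concrete, the following is a minimal sketch (not taken from the paper) of a nested MC estimator for a toy problem with a known answer: estimating I = E_x[(E_y[x + y])^2] with x, y ~ N(0, 1) independently, whose true value is E[x^2] = 1. Because the outer integrand squares the inner estimate, a nonlinear dependence, the finite inner sample size M introduces a bias of order 1/M, so both the outer and inner sample counts must grow for convergence, as the abstract describes. All function and variable names here are illustrative.

```python
import random

def nested_mc(outer_n, inner_m, seed=0):
    """Nested MC estimate of I = E_x[(E_y[x + y])^2], x, y ~ N(0, 1).

    For each outer draw x, the inner expectation E_y[x + y] (= x) is
    itself estimated by an inner MC average over inner_m draws of y.
    Squaring that inner estimate is a nonlinear map, so the estimator
    carries an O(1/inner_m) bias on top of the usual outer MC noise.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(outer_n):
        x = rng.gauss(0.0, 1.0)
        # inner estimator: MC average of x + y over inner_m samples
        inner = sum(x + rng.gauss(0.0, 1.0) for _ in range(inner_m)) / inner_m
        total += inner ** 2  # nonlinear outer integrand
    return total / outer_n
```

With outer_n = 2000 and inner_m = 100 the estimate should land near the true value 1, while very small inner_m (e.g. 1) leaves a visible bias of roughly 1/inner_m even as outer_n grows.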
