Adaptive importance sampling via minimization of estimators of cross-entropy, mean square, and inefficiency constant

Abstract

The inefficiency constant, equal to the product of the variance of an estimator and its mean computation cost, can be used to quantify the inefficiency of unbiased estimators in Monte Carlo (MC) procedures. When the mean computation cost is the same for different estimators, comparing their inefficiency constants reduces to comparing their variances, or equivalently their mean squares. We investigate adaptive methods for obtaining the parameters of an importance sampling (IS) change of measure by minimizing well-known estimators of the cross-entropy and of the mean square of the IS estimator, as well as new estimators of the mean square and of the inefficiency constant. The resulting IS parameters can then be used in separate IS MC procedures to estimate the quantity of interest. We develop single- and multi-stage methods for minimizing such estimators over families of IS distributions obtained from an exponential change of measure and from a discrete Girsanov transformation up to a stopping time. We prove convergence and asymptotic properties of the minimization results in our methods. We show that if a zero-variance IS parameter exists, then the minimization results for the new estimators can converge to that parameter at a faster rate than those for the well-known estimators, and that the positive definite asymptotic covariance matrix of the minimization results for the cross-entropy estimator is four times that for the well-known mean square estimator. We introduce criteria for comparing the asymptotic efficiency of stochastic optimization methods, applicable to methods minimizing estimators of the functions considered in this work. In our numerical experiments on computing expectations of functionals of the Euler scheme, minimization of the new estimators led to the lowest inefficiency constants and variances, followed by the well-known mean square estimator and the cross-entropy one.
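To illustrate the general flavor of the multi-stage approach described above, the following is a minimal, self-contained sketch (not the paper's method or models) of adaptive IS via cross-entropy minimization for a toy rare-event problem: estimating P(X > c) for X ~ N(0, 1) using an exponential change of measure to a tilted density N(theta, 1), for which the cross-entropy minimizer at each stage has a closed form. All names and the choice of target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 3.0       # threshold: estimate P(X > c), X ~ N(0, 1)
theta = 0.0   # parameter of the tilted IS density N(theta, 1)

# Multi-stage cross-entropy minimization: at each stage, sample from the
# current tilted density, weight by the likelihood ratio phi(x)/phi_theta(x)
# = exp(-theta*x + theta^2/2), and set theta to the weighted mean of the
# samples in the rare event (the closed-form CE minimizer for this family).
for _ in range(5):
    x = rng.normal(theta, 1.0, size=10_000)
    w = np.exp(-theta * x + 0.5 * theta**2)
    hit = x > c
    if hit.any():
        theta = np.sum(w[hit] * x[hit]) / np.sum(w[hit])

# Separate IS MC run with the fitted parameter to estimate the quantity.
x = rng.normal(theta, 1.0, size=100_000)
w = np.exp(-theta * x + 0.5 * theta**2)
est = np.mean(w * (x > c))
# For reference, P(X > 3) is approximately 1.35e-3; theta converges
# toward E[X | X > 3], which is about 3.28.
```

The same skeleton applies when the cross-entropy estimator is replaced by a mean-square or inefficiency-constant estimator, except that those objectives generally require numerical rather than closed-form minimization at each stage.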
