Pareto Smoothed Importance Sampling
Journal of Machine Learning Research (JMLR), 2015
Abstract
Importance weighting is a convenient general way to adjust for draws from the wrong distribution, but the resulting ratio estimate can be noisy when the importance weights have a heavy right tail, as routinely occurs when there are aspects of the target distribution not well captured by the approximating distribution. More stable estimates can be obtained by truncating the importance ratios. Here we present a new method for stabilizing importance weights using a generalized Pareto distribution fit to the upper tail of the distribution of the simulated importance ratios.
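The following is a minimal sketch of the smoothing idea described above: fit a generalized Pareto distribution to the largest importance ratios and replace them with order statistics of the fitted distribution, truncated at the raw maximum. The fixed 20% tail fraction, the use of scipy's maximum-likelihood GPD fit (rather than the paper's estimator), and the function name `pareto_smooth` are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import genpareto

def pareto_smooth(log_ratios, tail_frac=0.2):
    """Replace the largest importance ratios with order statistics of a
    generalized Pareto distribution fit to the upper tail (a sketch)."""
    r = np.exp(log_ratios - np.max(log_ratios))   # shift for numerical stability
    S = r.size
    M = int(np.ceil(tail_frac * S))               # number of ratios treated as the tail
    order = np.argsort(r)
    tail_idx = order[-M:]                         # indices of the M largest ratios
    cutoff = r[order[-M - 1]]                     # threshold just below the tail

    # Fit a GPD to the exceedances over the threshold (location fixed at 0).
    exceedances = r[tail_idx] - cutoff
    k, _, sigma = genpareto.fit(exceedances, floc=0.0)

    # Replace tail ratios with inverse-CDF values at equally spaced quantiles,
    # then truncate at the largest raw ratio.
    q = (np.arange(1, M + 1) - 0.5) / M
    smoothed = cutoff + genpareto.ppf(q, k, loc=0.0, scale=sigma)
    r_smoothed = r.copy()
    r_smoothed[tail_idx] = np.minimum(smoothed, r.max())
    return r_smoothed, k                          # k: estimated tail-shape diagnostic
```

The returned shape estimate k can serve as a rough diagnostic: larger values indicate a heavier right tail and a less reliable importance-sampling estimate.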
