Adaptive sequential Monte Carlo by means of mixture of experts
Appropriately selecting the proposal kernel of a particle filter is an issue of significant importance, since a poor choice may cause the particle sample to degenerate and, consequently, waste computational power. In this paper we introduce a novel algorithm that adaptively approximates the so-called optimal proposal kernel by a mixture of integrated curved exponential distributions with logistic weights. This family of distributions is broad enough to be used in the presence of multi-modal or strongly skewed distributions. This "mixture of experts" is fitted, via Monte Carlo EM or online EM methods, to the optimal kernel by minimizing the Kullback-Leibler divergence between the auxiliary target and instrumental distributions of the particle filter. The algorithm requires only one optimization problem to be solved for the whole sample, as opposed to existing methods that solve one problem per particle. In addition, we illustrate in a simulation study how the method can be successfully applied to optimal filtering in nonlinear state-space models.
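To see why the choice of proposal kernel matters, the following sketch compares, on a hypothetical linear-Gaussian toy model (not from the paper), the bootstrap proposal against the locally optimal proposal, which is available in closed form in this special case. The abstract's adaptive mixture-of-experts scheme targets exactly this optimal kernel in settings where no closed form exists; the model, parameter values, and function names below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy state-space model (chosen so the optimal proposal
# is Gaussian in closed form):
#   x_t = a * x_{t-1} + N(0, q),    y_t = x_t + N(0, r)
a, q, r = 0.9, 1.0, 0.1   # informative observations (r << q)
T, N = 50, 500            # time steps, number of particles

# Simulate a trajectory and observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), T)

def particle_filter(optimal: bool) -> float:
    """Run a particle filter; return the mean effective sample size."""
    p = rng.normal(0.0, 1.0, N)   # initial particle cloud
    ess_trace = []
    for t in range(1, T):
        mu = a * p
        if optimal:
            # Optimal proposal p(x_t | x_{t-1}, y_t): Gaussian here, with
            # variance s2 = (1/q + 1/r)^(-1) and matching mean.
            s2 = q * r / (q + r)
            m = s2 * (mu / q + y[t] / r)
            p_new = m + rng.normal(0.0, np.sqrt(s2), N)
            # Importance weights are then proportional to the predictive
            # likelihood p(y_t | x_{t-1}) = N(y_t; a*x_{t-1}, q + r).
            logw = -0.5 * (y[t] - mu) ** 2 / (q + r)
        else:
            # Bootstrap proposal: sample from the transition prior and
            # weight by the observation likelihood.
            p_new = mu + rng.normal(0.0, np.sqrt(q), N)
            logw = -0.5 * (y[t] - p_new) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        ess_trace.append(1.0 / np.sum(w ** 2))   # effective sample size
        p = p_new[rng.choice(N, N, p=w)]          # multinomial resampling
    return float(np.mean(ess_trace))

print("bootstrap ESS:", particle_filter(optimal=False))
print("optimal   ESS:", particle_filter(optimal=True))
```

Because the observations are much more informative than the dynamics here, the bootstrap proposal places most particles in low-likelihood regions and its effective sample size collapses, while the optimal proposal keeps the weights nearly uniform; this degeneracy is the "waste of computational power" the abstract refers to.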