De-Sequentialized Monte Carlo: a parallel-in-time particle smoother

4 February 2022
Adrien Corenflos, Nicolas Chopin, Simo Särkkä
arXiv:2202.02264
Abstract

Particle smoothers are SMC (Sequential Monte Carlo) algorithms designed to approximate the joint distribution of the states given the observations from a state-space model. We propose dSMC (de-Sequentialized Monte Carlo), a new particle smoother that is able to process $T$ observations in $\mathcal{O}(\log T)$ time on parallel architectures. This compares favourably with standard particle smoothers, whose complexity is linear in $T$. We derive $\mathcal{L}_p$ convergence results for dSMC, with an explicit upper bound that is polynomial in $T$. We then discuss how to reduce the variance of the smoothing estimates computed by dSMC by (i) designing good proposal distributions for sampling the particles at the initialization of the algorithm, and (ii) using lazy resampling to increase the number of particles used in dSMC. Finally, we design a particle Gibbs sampler based on dSMC, which is able to perform parameter inference in a state-space model at an $\mathcal{O}(\log T)$ cost on parallel hardware.
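The abstract's headline claim is the $\mathcal{O}(\log T)$ parallel time for processing $T$ observations. As a point of intuition only, the minimal JAX sketch below shows a standard mechanism by which a recursion over $T$ steps can be evaluated in logarithmic parallel depth: an associative scan over the $T$ elements. This is not the paper's dSMC algorithm; the toy linear model, the combine operator, and all variable names here are illustrative assumptions.

# Illustration only: logarithmic parallel depth via an associative scan,
# not the dSMC algorithm itself. We compose the 2x2 transition matrices
# of a toy linear state-space model; jax.lax.associative_scan evaluates
# all T prefix compositions A_t @ ... @ A_1 in O(log T) parallel depth.
import jax
import jax.numpy as jnp

def combine(a, b):
    # Associative operator: composition of linear transitions.
    # a holds the earlier steps and b the later ones, so the result is b @ a.
    return jnp.matmul(b, a)

T = 1024
key = jax.random.PRNGKey(0)
# T random, mildly contractive 2x2 transition matrices (chosen purely
# for the demo).
A = 0.5 * jax.random.normal(key, (T, 2, 2))

prefixes = jax.lax.associative_scan(combine, A)  # shape (T, 2, 2)

Because matrix composition is associative, the scan can pair elements in a balanced tree, which is what yields the logarithmic span on parallel hardware; dSMC achieves the same logarithmic scaling for the harder particle-smoothing problem itself.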
