Convergence of Dirichlet Forms for MCMC Optimal Scaling with General Target Distributions on Large Graphs

31 October 2022
Ning Ning
arXiv: 2210.17042
Abstract

Markov chain Monte Carlo (MCMC) algorithms have played a significant role in statistics, physics, machine learning, and other fields, and they are the only known general and efficient approach for some high-dimensional problems. The Metropolis-Hastings (MH) algorithm, the most classical MCMC algorithm, has had a great influence on the development and practice of science and engineering. The behavior of the MH algorithm in high-dimensional problems is typically investigated through a weak convergence result for diffusion processes. In this paper, we introduce Mosco convergence of Dirichlet forms to analyze the MH algorithm on large graphs whose target distribution is a Gibbs measure, a class that includes any probability measure satisfying a Markov property. The abstract and powerful theory of Dirichlet forms allows us to work directly and naturally on the infinite-dimensional space, and our notion of Mosco convergence allows the Dirichlet forms associated with the MH Markov chains to lie on changing Hilbert spaces. Through the optimal scaling problem, we demonstrate the strengths of the Dirichlet form approach over the standard diffusion approach.
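For readers unfamiliar with the optimal scaling problem the abstract refers to, the sketch below shows the classical setting it is posed in: a random-walk Metropolis-Hastings sampler whose proposal variance shrinks like 1/d with the dimension d, where the well-known ~0.234 optimal acceptance rate arises for i.i.d. product targets. This is not the paper's Dirichlet-form machinery; the target distribution, the constant ell, and the step counts are illustrative assumptions.

```python
# Minimal random-walk Metropolis-Hastings sketch with proposal scale
# sigma = ell / sqrt(d), the regime in which optimal scaling is studied.
# The Gaussian target and the value ell = 2.38 are assumptions for
# illustration, not taken from the paper.
import numpy as np

def rw_metropolis_hastings(log_target, x0, n_steps, ell=2.38, rng=None):
    """Random-walk MH with Gaussian proposal N(x, (ell^2 / d) * I)."""
    rng = np.random.default_rng() if rng is None else rng
    d = x0.shape[0]
    sigma = ell / np.sqrt(d)              # proposal variance of order 1/d
    x = x0.copy()
    logp_x = log_target(x)
    accepts = 0
    chain = np.empty((n_steps, d))
    for t in range(n_steps):
        y = x + sigma * rng.standard_normal(d)       # propose a jump
        logp_y = log_target(y)
        if np.log(rng.uniform()) < logp_y - logp_x:  # MH accept/reject step
            x, logp_x = y, logp_y
            accepts += 1
        chain[t] = x
    return chain, accepts / n_steps

if __name__ == "__main__":
    # Standard Gaussian product target in d = 100 dimensions; with
    # ell ~ 2.38 the empirical acceptance rate should be near 0.234.
    d = 100
    log_target = lambda x: -0.5 * np.dot(x, x)
    chain, acc_rate = rw_metropolis_hastings(log_target, np.zeros(d), 20_000)
    print(f"acceptance rate ~ {acc_rate:.3f}")
```

In the diffusion approach, one shows that a time-rescaled coordinate of this chain converges weakly to a Langevin diffusion, and the optimal ell maximizes the diffusion's speed; the paper instead studies such limits through Mosco convergence of the associated Dirichlet forms on changing Hilbert spaces.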
