
Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate functions

18 June 2015
Cheng Zhang
Babak Shahbaba
Hongkai Zhao
arXiv:1506.05555 · abs · PDF · HTML
Abstract

The relatively high computational cost of Bayesian methods often limits their application to big data analysis. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo (MCMC) method, namely Hamiltonian Monte Carlo (HMC). The key idea is to explore and exploit the regularity of the parameter space of the underlying probabilistic model in order to construct an effective approximation of the collective geometric and statistical properties of the whole observed dataset. To this end, we use shallow neural networks along with efficient learning algorithms. The choice of basis functions (i.e., hidden units in the neural network) and the optimized learning process yield a flexible, scalable, and efficient sampling algorithm. Experiments on simulated and real data show that our approach leads to substantially more efficient sampling than existing state-of-the-art methods.
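To make the idea concrete, below is a minimal sketch of surrogate-accelerated HMC, not the authors' implementation: an expensive target is replaced inside the leapfrog integrator by the gradient of a cheap shallow-network approximation. The 2-D Gaussian target, the single hidden layer with fixed random weights fit by regularized least squares, the way training points are drawn, and all step sizes are illustrative assumptions; the Metropolis correction still evaluates the exact log-posterior so the chain remains a valid sampler even though the dynamics are approximate.

# Sketch: HMC whose leapfrog dynamics use a shallow neural-network surrogate.
# Target, network size, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Target: unnormalized log-density of a correlated 2-D Gaussian.
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def log_post(theta):
    return -0.5 * theta @ Sigma_inv @ theta

# Surrogate: one tanh hidden layer with fixed random weights, linear output.
# Training points would normally come from an initial exploration phase;
# here they are simply drawn around the mode for illustration.
D, H = 2, 64                          # parameter dimension, hidden units
W = rng.normal(size=(H, D))           # random input weights (kept fixed)
b = rng.normal(size=H)                # random biases

X_train = rng.normal(scale=2.0, size=(500, D))
y_train = np.array([log_post(x) for x in X_train])
Phi = np.tanh(X_train @ W.T + b)      # hidden-layer features
# Output weights by regularized least squares (cheap, closed form).
beta = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(H), Phi.T @ y_train)

def surrogate_grad(theta):
    # d/dtheta [ beta . tanh(W theta + b) ] = W^T (beta * (1 - tanh^2))
    h = np.tanh(W @ theta + b)
    return W.T @ (beta * (1.0 - h ** 2))

# HMC step: leapfrog driven by the surrogate gradient, accept/reject with
# the exact log-posterior so the stationary distribution is preserved.
def hmc_step(theta, eps=0.1, n_leap=20):
    p = rng.normal(size=theta.shape)
    theta_new, p_new = theta.copy(), p.copy()
    p_new += 0.5 * eps * surrogate_grad(theta_new)
    for _ in range(n_leap - 1):
        theta_new += eps * p_new
        p_new += eps * surrogate_grad(theta_new)
    theta_new += eps * p_new
    p_new += 0.5 * eps * surrogate_grad(theta_new)
    log_accept = (log_post(theta_new) - 0.5 * p_new @ p_new) \
               - (log_post(theta) - 0.5 * p @ p)
    return theta_new if np.log(rng.uniform()) < log_accept else theta

samples = np.empty((2000, D))
theta = np.zeros(D)
for i in range(samples.shape[0]):
    theta = hmc_step(theta)
    samples[i] = theta
print("sample covariance:\n", np.cov(samples.T))

Because the surrogate gradient costs a single small matrix-vector product per leapfrog step, the per-iteration cost no longer scales with the size of the observed data; only the accept/reject step (here, a toy closed-form density) touches the exact model.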
