Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate Functions

The relatively high computational cost of Bayesian methods often limits their application to big data analysis. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo (MCMC) method, namely, Hamiltonian Monte Carlo (HMC). The key idea is to explore and exploit the regularity of the underlying probabilistic model's parameter space in order to construct an effective approximation of the collective geometric and statistical properties of the whole observed dataset. To this end, we use shallow neural networks along with efficient learning algorithms. The choice of basis functions (i.e., hidden units in neural networks) and the optimized learning process yields a flexible, scalable, and efficient sampling algorithm. Experiments based on simulated and real data show that our approach leads to substantially more efficient sampling algorithms than existing state-of-the-art methods.
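To make the idea concrete, here is a minimal sketch of one way such a surrogate-accelerated HMC sampler could be assembled: a shallow network with fixed random hidden weights is fit by ridge regression to the exact potential energy U(theta) = -log p(theta | data), its analytic gradient drives the cheap leapfrog trajectories, and the exact potential is evaluated only in the Metropolis correction, which preserves the correct stationary distribution. The toy Gaussian target, the random-feature training scheme, and all function names (fit_surrogate, hmc_step, etc.) are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' implementation): HMC whose
# leapfrog gradients come from a shallow neural-network surrogate of the
# potential energy U(theta) = -log p(theta | data).
import numpy as np

rng = np.random.default_rng(0)

# Exact target: potential of a correlated 2-D Gaussian, standing in for an
# expensive full-data log posterior.
Sigma_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
def U_exact(theta):                      # -log p(theta), up to a constant
    return 0.5 * theta @ Sigma_inv @ theta

# Shallow surrogate: one hidden tanh layer with fixed random weights, so only
# the linear output layer is learned (a cheap ridge regression).
H = 64
W, b = rng.normal(size=(H, 2)), rng.normal(size=H)
def features(theta):
    return np.tanh(W @ theta + b)

def fit_surrogate(train_thetas, lam=1e-3):
    Phi = np.array([features(t) for t in train_thetas])
    y = np.array([U_exact(t) for t in train_thetas])   # expensive calls, done once
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(H), Phi.T @ y)

def U_surr(theta, beta):                 # surrogate potential
    return features(theta) @ beta

def grad_U_surr(theta, beta):            # analytic gradient of the surrogate
    h = np.tanh(W @ theta + b)
    return W.T @ ((1.0 - h**2) * beta)

# One HMC step: surrogate gradients drive the leapfrog trajectory; the exact
# potential is evaluated only twice, in the Metropolis correction, so the
# exact target remains the stationary distribution.
def hmc_step(theta, beta, eps=0.1, L=20):
    p = rng.normal(size=theta.shape)
    theta_new = theta.copy()
    p_new = p - 0.5 * eps * grad_U_surr(theta_new, beta)
    for i in range(L):
        theta_new = theta_new + eps * p_new
        g = grad_U_surr(theta_new, beta)
        p_new = p_new - (eps if i < L - 1 else 0.5 * eps) * g
    H_old = U_exact(theta) + 0.5 * p @ p
    H_new = U_exact(theta_new) + 0.5 * p_new @ p_new
    return theta_new if np.log(rng.uniform()) < H_old - H_new else theta

# Fit the surrogate on points from a short exploratory phase, then sample.
beta = fit_surrogate(rng.normal(scale=2.0, size=(200, 2)))
theta, samples = np.zeros(2), []
for _ in range(1000):
    theta = hmc_step(theta, beta)
    samples.append(theta)
```

One design choice worth noting in this sketch: because the leapfrog map is volume-preserving and reversible for any smooth gradient field, substituting the surrogate gradient changes only the proposal, and the accept/reject step against the exact Hamiltonian keeps the sampler exact while confining the expensive density evaluations to two per iteration.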