
A deterministic and computable Bernstein-von Mises theorem

Abstract

Bernstein-von Mises (BvM) results establish that the Laplace approximation is asymptotically correct in the large-data limit. However, these results are ill-suited for computational purposes, since they hold only over most, not all, datasets and involve hard-to-estimate constants. In this article, I present a new BvM theorem which bounds the Kullback-Leibler (KL) divergence between a fixed log-concave density $f(\boldsymbol{\theta})$ and its Laplace approximation. The bound goes to $0$ as the higher derivatives of $f(\boldsymbol{\theta})$ tend to $0$ and $f(\boldsymbol{\theta})$ becomes increasingly Gaussian. The classical BvM theorem in the IID large-data limit is recovered as a corollary. Critically, this theorem further suggests a number of computable approximations of the KL divergence, the most promising being: \[ KL\left(g_{LAP},f\right)\approx\frac{1}{2}\text{Var}_{\boldsymbol{\theta}\sim g\left(\boldsymbol{\theta}\right)}\left(\log\left[f\left(\boldsymbol{\theta}\right)\right]-\log\left[g_{LAP}\left(\boldsymbol{\theta}\right)\right]\right) \] An empirical investigation of these bounds in the logistic classification model reveals that these approximations are accurate surrogates for the KL divergence. This result, and future results of a similar nature, could provide a path towards rigorously controlling the error due to the Laplace approximation and more modern approximation methods.
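To make the surrogate concrete, the following is a minimal Monte Carlo sketch, not taken from the paper: it fits a Laplace approximation to a hypothetical one-dimensional logistic-regression posterior (the data, prior, and all variable names are illustrative assumptions) and estimates half the variance of the log-ratio under samples from the Gaussian approximation.

```python
# Sketch of the proposed KL surrogate (assumptions: 1-D logistic model,
# N(0,1) prior, synthetic data):
#   KL(g_LAP, f) ~= 0.5 * Var_{theta ~ g_LAP}( log f(theta) - log g_LAP(theta) )

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Hypothetical data for a 1-D logistic model: inputs x_i, labels y_i in {0,1}.
x = rng.normal(size=50)
y = (rng.random(50) < 1.0 / (1.0 + np.exp(-1.5 * x))).astype(float)

def neg_log_f(theta):
    """Unnormalised negative log-posterior: logistic likelihood + N(0,1) prior."""
    logits = theta * x
    log_lik = np.sum(y * logits - np.logaddexp(0.0, logits))  # stable log(1+e^z)
    log_prior = -0.5 * theta**2
    return -(log_lik + log_prior)

# Laplace approximation g_LAP: Gaussian at the mode with the local curvature.
mode = minimize_scalar(neg_log_f).x
eps = 1e-4
hess = (neg_log_f(mode + eps) - 2 * neg_log_f(mode) + neg_log_f(mode - eps)) / eps**2
sigma = 1.0 / np.sqrt(hess)

def log_g_lap(theta):
    return -0.5 * ((theta - mode) / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma**2)

# Monte Carlo estimate of the surrogate. Note that log f is unnormalised:
# the unknown normalising constant only shifts the log-ratio by a constant,
# which cancels inside the variance.
samples = rng.normal(mode, sigma, size=20_000)
log_ratio = np.array([-neg_log_f(t) - log_g_lap(t) for t in samples])
kl_surrogate = 0.5 * np.var(log_ratio)
print(f"surrogate KL(g_LAP, f) ~= {kl_surrogate:.6f}")
```

A convenient feature of this estimator, as the comment above notes, is that it needs only the unnormalised posterior: the intractable evidence term drops out of the variance, which is what makes the surrogate computable in practice.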
