Kullback-Leibler Divergence for the Normal-Gamma Distribution

Abstract

We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one using simulated data and one using empirical data.
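As a minimal sketch of how such a divergence can be evaluated numerically, the snippet below computes the KL divergence between two normal-gamma distributions NG(μ, λ; μ0, λ0, a, b) = N(μ | μ0, (λ0·λ)⁻¹) · Gam(λ | a, b), using the standard chain-rule decomposition into a gamma-marginal term plus an expected conditional-normal term (rate parameterization for the gamma). The function name and parameter names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.special import gammaln, digamma

def kl_normal_gamma(mu1, l1, a1, b1, mu2, l2, a2, b2):
    """Sketch: KL[ NG(mu1, l1, a1, b1) || NG(mu2, l2, a2, b2) ] where
    NG(mu, lam) = N(mu | mu0, (l0*lam)^-1) * Gam(lam | a, b),
    with Gam in the shape/rate parameterization."""
    # Expected KL between the conditional normals, averaged over
    # lam ~ Gam(a1, b1); uses E[lam] = a1/b1 for the mean-shift term.
    kl_normal = 0.5 * (a1 / b1 * l2 * (mu1 - mu2) ** 2
                       + l2 / l1 - np.log(l2 / l1) - 1)
    # KL between the gamma marginals Gam(a1, b1) and Gam(a2, b2).
    kl_gamma = (a2 * (np.log(b1) - np.log(b2))
                - gammaln(a1) + gammaln(a2)
                + (a1 - a2) * digamma(a1)
                + a1 * (b2 - b1) / b1)
    return kl_normal + kl_gamma
```

As a sanity check, the divergence is zero when both distributions share the same parameters and strictly positive otherwise.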
