A Property of the Kullback--Leibler Divergence for Location-scale Models
Abstract
In this paper, we discuss a property of the Kullback--Leibler divergence measured between two models in the family of location-scale distributions. We show that, if two models M1 and M2 are represented by location-scale distributions, then the minimum Kullback--Leibler divergence from M1 to M2, with respect to the parameters of M2, is independent of the value of the parameters of M1. Furthermore, we show that the property holds for models that can be transformed into location-scale distributions. We illustrate a possible application of the property in objective Bayesian model selection.
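The stated property can be checked numerically. The sketch below (not the paper's code; the choice of Normal for M1 and Laplace for M2, the quadrature scheme, and the grid search are all illustrative assumptions) minimizes the Kullback--Leibler divergence from a Normal model to a Laplace model over the Laplace location and scale, and shows the minimum does not change when the Normal parameters change:

```python
import math

# Hedged numerical illustration: for two location-scale families
# (Normal as M1, Laplace as M2), check that
#     min_{m, b} KL( Normal(mu, sigma) || Laplace(m, b) )
# is independent of (mu, sigma).

def kl_normal_laplace(mu, sigma, m, b, n=2000, width=12.0):
    """KL(Normal(mu, sigma) || Laplace(m, b)) by trapezoidal quadrature
    on [mu - width*sigma, mu + width*sigma]."""
    lo, hi = mu - width * sigma, mu + width * sigma
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        # Normal density p(x) and log-densities of both models.
        log_p = (-((x - mu) ** 2) / (2.0 * sigma ** 2)
                 - math.log(sigma * math.sqrt(2.0 * math.pi)))
        log_q = -math.log(2.0 * b) - abs(x - m) / b
        term = math.exp(log_p) * (log_p - log_q)
        total += term if 0 < i < n else term / 2.0  # trapezoid endpoints
    return total * h

def min_kl(mu, sigma):
    """Crude grid search over the Laplace parameters (m, b); the grids are
    scaled by (mu, sigma) so the same relative resolution is used."""
    best = float("inf")
    for i in range(-5, 6):
        m = mu + 0.2 * sigma * i           # location grid centered at mu
        for j in range(1, 40):
            b = 0.05 * sigma * j           # scale grid up to ~2*sigma
            best = min(best, kl_normal_laplace(mu, sigma, m, b))
    return best

# Very different Normal parameters give (numerically) the same minimum,
# consistent with the property stated in the abstract.
print(min_kl(0.0, 1.0))
print(min_kl(7.0, 3.5))
```

The minimized divergence agrees across parameter settings up to the grid and quadrature resolution; in this Normal-vs-Laplace case it is a small constant near 0.05, depending only on the pair of base densities, not on the location or scale of M1.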
