We propose new change of measure inequalities based on f-divergences (of which the Kullback-Leibler divergence is a particular case). Our strategy relies on combining the Legendre transform of f-divergences and the Young-Fenchel inequality. By exploiting these new change of measure inequalities, we derive new PAC-Bayesian generalisation bounds with a complexity involving f-divergences, and holding in mostly uncharted settings (such as heavy-tailed losses). We instantiate our results for the most popular f-divergences.
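As a rough illustration of the kind of argument described above (a minimal sketch in our own notation, not a statement taken from the paper), the Fenchel-Young inequality xy <= f(x) + f*(y), applied pointwise with x = dQ/dP and y a measurable function phi, yields a generic change of measure inequality for a convex f with convex conjugate f* and distributions Q absolutely continuous with respect to P:

\mathbb{E}_Q[\phi] \;=\; \mathbb{E}_P\!\left[\frac{dQ}{dP}\,\phi\right] \;\le\; \mathbb{E}_P\!\left[f\!\left(\frac{dQ}{dP}\right)\right] + \mathbb{E}_P\!\left[f^{*}(\phi)\right] \;=\; D_f(Q \,\|\, P) + \mathbb{E}_P\!\left[f^{*}(\phi)\right].

Bounds of this shape trade the usual Kullback-Leibler complexity term for the f-divergence D_f(Q || P), plus a moment term E_P[f*(phi)] whose finiteness requirements can be weaker than exponential moment conditions, which is what makes settings such as heavy-tailed losses accessible.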