Proper Scoring Rules and Bregman Divergences

We present a new perspective on proper scoring rules (PSRs) in which they are naturally derived from a more general convex construction that also includes functional Bregman divergences. Our motivation lies in the fact that sets of probability distributions are mathematically negligible, and functions defined exclusively on such sets do not have good analytical properties. We first examine the fact that all entropy functions may be extended to sublinear functions of denormalised probability densities, and that PSRs are characterised as their subgradients. We then explore general convex extensions of entropy functions and uncover the connection between functional Bregman divergences and PSRs. Finally, we examine and systematise previous characterisation results for PSRs and Bregman divergences in a unified theoretical framework.
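As a brief illustration of the connection between entropy functions, PSRs, and Bregman divergences described above (a standard example with notation we introduce here, not taken from the paper: $G$ denotes a convex entropy function, $S$ the induced scoring rule, $d$ the induced divergence, and $\delta_x$ the point mass at $x$), the following sketch shows how a subgradient of $G$ yields a proper scoring rule whose expected score gap is the Bregman divergence of $G$, with the logarithmic score as the concrete instance.

```latex
% Sketch (assumed notation): G convex entropy function, G'(p) a subgradient of G at p.
% The induced scoring rule is the affine support of G at p evaluated at the point mass:
\[
  S(p, x) \;=\; G(p) + \big\langle G'(p),\, \delta_x - p \big\rangle .
\]
% The induced divergence is the expected score gap, i.e. the Bregman divergence of G:
\[
  d(q, p) \;=\; \mathbb{E}_{x \sim q}\big[ S(q, x) - S(p, x) \big]
           \;=\; G(q) - G(p) - \big\langle G'(p),\, q - p \big\rangle .
\]
% Concrete instance: with G(p) = \int p \log p (negative Shannon entropy, convex)
% and G'(p) = \log p + 1, the construction recovers the logarithmic score and
% the Kullback--Leibler divergence:
\[
  S(p, x) = \log p(x), \qquad d(q, p) = \mathrm{KL}(q \,\|\, p).
\]
```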