
Information Measures, Experiments, Multi-category Hypothesis Tests, and Surrogate Losses

Abstract

We provide a unifying view of statistical information measures, multi-class classification problems, multi-way Bayesian hypothesis testing, and loss functions, elaborating equivalence results between all of these objects. In particular, we consider a generalization of $f$-divergences to multiple distributions, and we show that there is a constructive equivalence between $f$-divergences, statistical information (in the sense of uncertainty as elaborated by DeGroot), and loss functions for multi-category classification. We also study an extension of our results to multi-class classification problems in which we must both infer a discriminant function $\gamma$ and a data representation (or, in the setting of a hypothesis testing problem, an experimental design), represented by a quantizer $\mathsf{q}$ from a family of possible quantizers $\mathsf{Q}$. There, we give a complete characterization of the equivalence between loss functions, meaning that optimizing either of two losses yields the same optimal discriminant and quantizer $\mathsf{q}$. A main consequence of our results is a description of those convex loss functions that are Fisher consistent for jointly choosing a data representation and minimizing the (weighted) probability of error in multi-category classification and hypothesis testing problems.
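
For context, the classical $f$-divergence between two distributions $P$ and $Q$, defined for a convex function $f$ with $f(1) = 0$, is

\[
  D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ .
\]

One common way such definitions are extended to several distributions $P_1, \dots, P_k$ (shown here only as an illustration; the abstract does not reproduce the paper's exact definition) replaces the single likelihood ratio with a vector of ratios against a dominating measure $\mu$, with $f$ a closed convex function on $\mathbb{R}_+^k$:

\[
  D_f(P_1, \dots, P_k) \;=\; \int f\!\left(\frac{dP_1}{d\mu}, \dots, \frac{dP_k}{d\mu}\right) d\mu .
\]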
