
Information-Theoretic Bounds on the Moments of the Generalization Error of Learning Algorithms

International Symposium on Information Theory (ISIT), 2021
Abstract

Generalization error bounds are critical to understanding the performance of machine learning models. In this work, building upon a new bound on the expected value of an arbitrary function of the population and empirical risk of a learning algorithm, we offer a more refined analysis of the generalization behaviour of machine learning models, based on a characterization of (bounds on) their generalization error moments. We discuss how the proposed bounds -- which also encompass new bounds on the expected generalization error -- relate to existing bounds in the literature. We also discuss how the proposed generalization error moment bounds can be used to construct new high-probability bounds on the generalization error.
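The abstract's final claim, that moment bounds imply high-probability bounds, can be illustrated by a standard route (a sketch, not necessarily the paper's exact construction): applying Markov's inequality to the $k$-th absolute moment of the generalization error $\mathrm{gen}(W,S) = L(W) - L_S(W)$, where $L$ and $L_S$ denote the population and empirical risk:

```latex
% Markov's inequality applied to the k-th absolute moment:
\Pr\big(|\mathrm{gen}(W,S)| \ge \epsilon\big)
  \le \frac{\mathbb{E}\big[|\mathrm{gen}(W,S)|^{k}\big]}{\epsilon^{k}}.
% Hence any bound M_k on the k-th moment yields, for every epsilon > 0,
\Pr\big(|\mathrm{gen}(W,S)| \ge \epsilon\big) \le \frac{M_k}{\epsilon^{k}},
% and one may optimize the resulting tail bound over the moment order k.
```

In this way, tighter control of higher-order moments translates directly into faster-decaying tail probabilities.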
