Relative Deviation Margin Bounds

Abstract

We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, both data-dependent bounds and bounds that hold for general hypothesis families, in terms of the Rademacher complexity or the empirical $\ell_\infty$ covering number of the hypothesis set used. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
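
As background, and as a hedged sketch rather than a statement of the paper's results, the quantities named above can be made concrete as follows. For a sample $S = ((x_1, y_1), \ldots, (x_m, y_m))$ and a fixed margin parameter $\rho > 0$, the empirical margin loss of a hypothesis $h$ is

$$\widehat{R}_{S,\rho}(h) = \frac{1}{m} \sum_{i=1}^{m} \mathbf{1}_{y_i h(x_i) \le \rho}.$$

The classical (non-relative) margin bound based on Rademacher complexity states that, with probability at least $1 - \delta$, for all $h \in H$,

$$R(h) \le \widehat{R}_{S,\rho}(h) + \frac{2}{\rho}\, \mathfrak{R}_m(H) + \sqrt{\frac{\log(1/\delta)}{2m}},$$

where $\mathfrak{R}_m(H)$ denotes the Rademacher complexity of $H$. A relative deviation bound instead controls the deviation $R(h) - \widehat{R}_{S,\rho}(h)$ by a term scaling with $\sqrt{\widehat{R}_{S,\rho}(h)}$, which yields a more favorable guarantee when the empirical margin loss is small; the exact statements and constants are given in the paper.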
