Characterizing the Sample Complexity of Large-Margin Learning With
Second-Order Statistics
Journal of Machine Learning Research (JMLR), 2012
Abstract
We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization. We introduce the margin-adapted dimension, a simple function of the second-order statistics of the data distribution, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the margin-adapted dimension of the data distribution. The upper bounds are universal, and the lower bounds hold for a rich family of sub-Gaussian distributions. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification.
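The abstract does not spell out the definition of the margin-adapted dimension. A common formulation (an assumption here, not quoted from this abstract) defines it, for margin gamma, as the smallest k such that k * gamma^2 is at least the sum of all but the k largest eigenvalues of the data's second-moment matrix. A minimal sketch under that assumption:

```python
import numpy as np

def margin_adapted_dimension(X, gamma):
    """Sketch of the margin-adapted dimension for margin gamma.

    Assumes the definition: the smallest k with k * gamma^2 >= sum of the
    eigenvalues of the (uncentered) second-moment matrix beyond the top k.
    This is an illustrative formulation, not quoted from the abstract.
    """
    n = len(X)
    # Eigenvalues of the uncentered second-moment matrix, sorted descending.
    lam = np.sort(np.linalg.eigvalsh(X.T @ X / n))[::-1]
    for k in range(len(lam) + 1):
        if k * gamma ** 2 >= lam[k:].sum():
            return k
    return len(lam)

# Toy data: 5 samples along the coordinate axes, giving an identity
# second-moment matrix (all eigenvalues equal to 1).
X = np.sqrt(5) * np.eye(5)
k_large_margin = margin_adapted_dimension(X, gamma=10.0)  # large margin -> small k
k_small_margin = margin_adapted_dimension(X, gamma=0.1)   # small margin -> k near d
```

With an isotropic distribution and a large margin, a single direction already satisfies the inequality, while a tiny margin forces k up to the full dimension; distributions whose spectrum decays quickly get a small margin-adapted dimension even at modest margins.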
