Improved Margin Generalization Bounds for Voting Classifiers
Annual Conference Computational Learning Theory (COLT), 2025
Main text: 32 pages; Bibliography: 3 pages
Abstract
In this paper we establish a new margin-based generalization bound for voting classifiers, refining existing results and yielding tighter generalization guarantees for widely used boosting algorithms such as AdaBoost (Freund and Schapire, 1997). Furthermore, the new margin-based generalization bound enables the derivation of an optimal weak-to-strong learner: a majority vote of three large-margin classifiers achieving an expected error that matches the theoretical lower bound. This result provides a more natural alternative to the Majority-of-5 algorithm of Høgsgaard et al. (2024), and matches the Majority-of-3 result of Aden-Ali et al. (2024) for the realizable prediction model.
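To illustrate the voting construction at the heart of the result, here is a minimal sketch of a Majority-of-3 classifier: three base classifiers with outputs in {-1, +1} are combined by an unweighted majority vote. The base classifiers `h1`, `h2`, `h3` below are hypothetical threshold rules standing in for the large-margin classifiers of the paper; this is not the authors' algorithm, only the generic voting scheme.

```python
import numpy as np

def majority_of_3(h1, h2, h3):
    """Combine three {-1, +1}-valued classifiers by unweighted majority vote.

    With three voters there can be no tie, so the output is always in {-1, +1}.
    """
    def vote(X):
        # Sum of three votes in {-1, +1} lies in {-3, -1, 1, 3};
        # its sign is the majority label.
        return np.sign(h1(X) + h2(X) + h3(X))
    return vote

# Toy base classifiers (hypothetical stand-ins for large-margin learners).
h1 = lambda X: np.where(X[:, 0] > 0, 1, -1)
h2 = lambda X: np.where(X[:, 1] > 0, 1, -1)
h3 = lambda X: np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

X = np.array([[1.0, 1.0], [-1.0, 2.0], [-1.0, -1.0]])
print(majority_of_3(h1, h2, h3)(X))
```

On the second example, the voters disagree (h1 says -1, h2 and h3 say +1), and the majority prevails; boosting-style analyses bound how often such a vote can err in terms of the margins of the base classifiers.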
