arXiv: 1110.3907

AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem

18 October 2011
Peng Sun
Mark D. Reid
Abstract

LogitBoost and its later improvement, ABC-LogitBoost, are both successful multi-class boosting algorithms for classification. In this paper, we explicitly formulate the tree building at each LogitBoost iteration as a constrained quadratic optimization problem. Both LogitBoost and ABC-LogitBoost adopt approximate solvers for this quadratic subproblem. We then propose an intuitively more natural solver, namely a block coordinate descent algorithm, and demonstrate that it leads to higher classification accuracy and a faster convergence rate on a number of public datasets. The new LogitBoost behaves as if it adaptively combines many one-vs-one binary classifiers, hence the name AOSO-LogitBoost (Adaptive One-vs-One LogitBoost).
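The core idea of the abstract — solving a sum-to-zero-constrained quadratic by updating one pair of coordinates at a time — can be illustrated in isolation. The sketch below is not the paper's tree-building algorithm; it is a minimal, hypothetical block coordinate descent on a generic quadratic \(f(x) = \tfrac{1}{2}x^\top A x - b^\top x\) under \(\sum_i x_i = 0\), where each step adjusts a single pair of coordinates in opposite directions (the "one-vs-one" flavor of update), with the pair chosen adaptively from the current gradient. All function and variable names are illustrative.

```python
import numpy as np

def pairwise_cd_quadratic(A, b, n_iter=5000, tol=1e-12):
    """Minimize f(x) = 0.5 x^T A x - b^T x  subject to  sum(x) = 0
    by block coordinate descent over coordinate *pairs*.

    Each step moves along the direction e_r - e_s, which preserves the
    sum-to-zero constraint -- analogous to updating one pair of class
    scores per iteration. A is assumed symmetric positive definite.
    """
    n = len(b)
    x = np.zeros(n)  # feasible start: sums to zero
    for _ in range(n_iter):
        g = A @ x - b                       # gradient of f at x
        r, s = np.argmin(g), np.argmax(g)   # adaptively chosen pair
        spread = g[s] - g[r]
        if spread < tol:                    # gradient ~constant: KKT point
            break
        h = A[r, r] + A[s, s] - 2.0 * A[r, s]  # curvature along e_r - e_s
        if h <= 0:
            break
        delta = spread / h                  # exact line search on the quadratic
        x[r] += delta
        x[s] -= delta                       # keeps sum(x) == 0
    return x
```

At a constrained optimum the gradient must be constant across coordinates (its components differ only by the Lagrange multiplier of the sum constraint), so the loop terminates when the gradient spread is negligible. The pair selection mirrors the "adaptive one-vs-one" intuition: the two coordinates with the most extreme gradient components give the steepest feasible pairwise descent direction.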
