AdaBoost is not an Optimal Weak to Strong Learner

International Conference on Machine Learning (ICML), 2023
27 January 2023
Mikael Møller Høgsgaard, Kasper Green Larsen, Martin Ritzert

Papers citing "AdaBoost is not an Optimal Weak to Strong Learner"

4 papers
Tight Margin-Based Generalization Bounds for Voting Classifiers over Finite Hypothesis Sets
Kasper Green Larsen, Natascha Schalburg
25 Nov 2025

Improved Margin Generalization Bounds for Voting Classifiers
Annual Conference on Computational Learning Theory (COLT), 2025
Mikael Møller Høgsgaard, Kasper Green Larsen
23 Feb 2025

Sample-Efficient Agnostic Boosting
Neural Information Processing Systems (NeurIPS), 2024
Udaya Ghai, Karan Singh
31 Oct 2024

The Many Faces of Optimal Weak-to-Strong Learning
Neural Information Processing Systems (NeurIPS), 2024
Mikael Møller Høgsgaard, Kasper Green Larsen, Markus Engelund Mathiasen
30 Aug 2024