Cited By: arXiv:1307.1192
AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods
4 July 2013
R. Freund
Paul Grigas
Rahul Mazumder
Papers citing "AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods" (6 of 6 papers shown)
Overview of AdaBoost: Reconciling its Views to Better Understand its Dynamics
Perceval Beja-Battais
06 Oct 2023
AdaSelection: Accelerating Deep Learning Training through Data Subsampling
Minghe Zhang
Chaosheng Dong
Jinmiao Fu
Tianchen Zhou
Jia Liang
...
Bo Liu
Michinari Momma
Bryan Wang
Yan Gao
Yi Sun
19 Jun 2023
A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-$\ell_1$-Norm Interpolated Classifiers
Social Science Research Network (SSRN), 2020
Tengyuan Liang
Pragya Sur
05 Feb 2020
Greedy algorithms for prediction
Alessio Sancetta
05 Feb 2016
A New Perspective on Boosting in Linear Regression via Subgradient Optimization and Relatives
R. Freund
Paul Grigas
Rahul Mazumder
16 May 2015
Explaining the Success of AdaBoost and Random Forests as Interpolating Classifiers
A. Wyner
Matthew A. Olson
J. Bleich
David Mease
28 Apr 2015