End-to-end Feature Selection Approach for Learning Skinny Trees

28 October 2023
Shibal Ibrahim
Kayhan Behdin
Rahul Mazumder
Abstract

We propose a new optimization-based approach for feature selection in tree ensembles, an important problem in statistics and machine learning. Popular tree ensemble toolkits, e.g., Gradient Boosted Trees and Random Forests, support post-training feature selection based on feature importance scores; while widely used, these scores are known to have drawbacks. We propose Skinny Trees: an end-to-end toolkit for feature selection in tree ensembles that trains a tree ensemble while controlling the number of selected features. Our optimization-based approach learns an ensemble of differentiable trees and simultaneously performs feature selection using a grouped ℓ₀-regularizer. We use first-order methods for optimization and present convergence guarantees for our approach. We use a dense-to-sparse regularization scheduling scheme that can lead to more expressive and sparser tree ensembles. On 15 synthetic and real-world datasets, Skinny Trees achieves 1.5×–620× feature compression rates, leading to up to 10× faster inference over dense trees, without any loss in performance. Skinny Trees also leads to better feature selection than many existing toolkits: e.g., in terms of AUC at a 25% feature budget, Skinny Trees outperforms LightGBM by 10.2% (up to 37.7%) and Random Forests by 3% (up to 12.5%).
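The abstract does not spell out the optimization details, but the two key ingredients it names — a grouped ℓ₀-regularizer applied per feature and a dense-to-sparse schedule — can be illustrated with a minimal NumPy sketch. Here the grouped ℓ₀ proximal step is approximated by hard-thresholding whole feature rows of a (hypothetical) feature-weight matrix shared across the ensemble; the function names, matrix layout, and linear schedule are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def grouped_l0_prox(W, k):
    """Grouped hard-thresholding: keep the k feature groups (rows of W)
    with the largest L2 norm and zero out all other rows.

    W : (n_features, n_units) array; one row per input feature
        (hypothetical layout, shared across all trees in the ensemble).
    k : feature budget, i.e. number of rows allowed to remain nonzero.
    """
    norms = np.linalg.norm(W, axis=1)      # one L2 norm per feature group
    keep = np.argsort(norms)[-k:]          # indices of the k largest groups
    mask = np.zeros(W.shape[0], dtype=bool)
    mask[keep] = True
    return W * mask[:, None]               # unselected feature rows -> 0

def dense_to_sparse_schedule(step, total_steps, n_features, budget):
    """Linearly shrink the allowed feature count from n_features down to
    the final budget over training -- a simple stand-in for the paper's
    dense-to-sparse regularization scheduling."""
    frac = step / max(total_steps, 1)
    k = int(round(n_features - frac * (n_features - budget)))
    return max(k, budget)
```

In a training loop, one would interleave gradient steps on the differentiable tree ensemble with `grouped_l0_prox(W, dense_to_sparse_schedule(step, ...))`, so the model starts dense and is gradually forced onto a shrinking feature budget.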

@article{ibrahim2025_2310.18542,
  title={End-to-end Feature Selection Approach for Learning Skinny Trees},
  author={Shibal Ibrahim and Kayhan Behdin and Rahul Mazumder},
  journal={arXiv preprint arXiv:2310.18542},
  year={2025}
}