
NRGBoost: Energy-Based Generative Boosted Trees

International Conference on Learning Representations (ICLR), 2024
João Bravo
Main: 10 pages · Appendix: 16 pages · Bibliography: 3 pages · 7 figures · 14 tables
Abstract

Despite the rise to dominance of deep learning in unstructured data domains, tree-based methods such as Random Forests (RF) and Gradient Boosted Decision Trees (GBDT) are still the workhorses for handling discriminative tasks on tabular data. We explore generative extensions of these popular algorithms, with a focus on explicitly modeling the data density (up to a normalization constant), thus enabling other applications besides sampling. As our main contribution we propose an energy-based generative boosting algorithm that is analogous to the second-order boosting implemented in popular libraries like XGBoost. We show that, despite producing a generative model capable of handling inference tasks over any input variable, our proposed algorithm can achieve similar discriminative performance to GBDT on a number of real-world tabular datasets, outperforming alternative generative approaches. At the same time, we show that it is also competitive with neural-network-based models for sampling. Code is available at this https URL.
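To make the energy-based framing concrete, here is a minimal sketch (not the paper's released implementation) of the idea described above: the model defines an unnormalized log-density as a sum of boosted-tree outputs, and because the density is explicit (up to a normalization constant), inference over any variable amounts to renormalizing along that variable. The stand-in "trees" and function names below are hypothetical, for illustration only.

import numpy as np

def energy(x, trees):
    """Unnormalized log-density: the sum of all tree outputs at point x."""
    return sum(tree(x) for tree in trees)

def conditional_distribution(x, target_index, target_values, trees):
    """p(x[target_index] = v | remaining components of x) for a discrete
    target variable, obtained by enumerating candidate values and
    renormalizing exp(energy) over them."""
    logits = []
    for v in target_values:
        x_v = np.array(x, dtype=float)
        x_v[target_index] = v
        logits.append(energy(x_v, trees))
    logits = np.array(logits)
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    return probs / probs.sum()

# Toy stand-ins for fitted boosted trees (piecewise-constant functions).
trees = [
    lambda x: 0.8 if x[0] > 0.5 else -0.2,
    lambda x: 0.5 if x[1] == 1.0 else -0.5,
]

x = np.array([0.7, 1.0])  # feature 0 continuous, feature 1 binary
print(conditional_distribution(x, target_index=1, target_values=[0.0, 1.0], trees=trees))

The same ensemble can thus serve discriminative prediction (condition on the label column) or imputation of any other column, which is the "inference over any input variable" property the abstract refers to; sampling and the second-order boosting objective themselves require the machinery described in the paper.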
