Universal Consistency of Decision Trees for High Dimensional Additive Models

This paper shows that decision trees constructed with the Classification and Regression Trees (CART) methodology are universally consistent for additive models, even when the dimensionality scales exponentially with the sample size, under certain sparsity constraints. The consistency is universal in the sense that no a priori assumptions are placed on the distribution of the input variables. Surprisingly, this adaptivity to (approximate or exact) sparsity is achieved by a single tree, rather than requiring an ensemble as one might expect. Finally, we show that these qualitative properties of individual trees are inherited by Breiman's random forests. A key step in the analysis is the establishment of an oracle inequality, which precisely characterizes the tradeoff between goodness of fit and model complexity.