A Rectification-Based Approach for Distilling Boosted Trees into Decision Trees

Main: 23 pages, Bibliography: 3 pages, Appendix: 3 pages; 5 figures, 6 tables
Abstract
We present a new approach for distilling boosted trees into decision trees, with the goal of producing an ML model that offers an acceptable compromise between predictive performance and interpretability. We explain how the correction approach called rectification can be used to implement such a distillation process. We show empirically that this approach yields promising results compared with a distillation approach based on retraining the model.
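As a rough illustration of the retraining-based distillation baseline mentioned in the abstract (not the paper's rectification method), the sketch below trains a boosted-tree teacher and then refits a single decision tree on the teacher's predictions so that the student mimics the ensemble while staying interpretable. The dataset, model classes, and hyperparameters (scikit-learn's GradientBoostingClassifier and DecisionTreeClassifier, max_depth=5) are assumptions chosen purely for the example.

```python
# Illustrative sketch of distillation by retraining (assumed setup, not the
# rectification approach from the paper): a single decision tree is fit on the
# boosted ensemble's predictions rather than on the original labels.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Teacher: boosted trees trained on the original labels.
teacher = GradientBoostingClassifier(n_estimators=200, random_state=0)
teacher.fit(X_train, y_train)

# Student: one shallow decision tree retrained on the teacher's predictions.
student = DecisionTreeClassifier(max_depth=5, random_state=0)
student.fit(X_train, teacher.predict(X_train))

print("teacher accuracy:", accuracy_score(y_test, teacher.predict(X_test)))
print("student accuracy:", accuracy_score(y_test, student.predict(X_test)))
# Fidelity: how often the student agrees with the teacher on held-out data.
print("student fidelity:", accuracy_score(teacher.predict(X_test),
                                          student.predict(X_test)))
```

The fidelity score measures how faithfully the interpretable student reproduces the ensemble's behaviour, which is the quantity a distillation procedure tries to maximise alongside raw accuracy.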