Computationally Efficient Feature Significance and Importance for Machine Learning Models

Abstract
We develop a simple and computationally efficient significance test for the features of a machine learning model. Our forward-selection approach applies to any model specification, learning task and variable type. The test is non-asymptotic, straightforward to implement, and does not require model refitting. It identifies the statistically significant features as well as feature interactions of any order in a hierarchical manner, and generates a model-free notion of feature importance. Experimental and empirical results illustrate its performance.
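The abstract does not spell out the test statistic or how significance is assessed, so the following is only a minimal illustrative sketch of a forward-selection feature-significance loop that avoids refitting: each candidate feature is scored by the held-out loss reduction it yields when revealed to an already fitted model, and a permutation-based p-value decides whether to keep it. The masking-by-baseline scheme, the permutation null, and all function names are assumptions for illustration, not the paper's actual test.

```python
# Hypothetical sketch of a forward-selection feature-significance loop.
# The loss-reduction statistic, baseline masking, and permutation p-value
# below are illustrative assumptions, not the procedure from the paper.
import numpy as np
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

def masked_loss(model, X, y, active, baseline):
    """Held-out loss when only `active` features keep their values;
    all other features are set to a fixed baseline (no refitting)."""
    X_masked = np.tile(baseline, (X.shape[0], 1))
    X_masked[:, active] = X[:, active]
    return mean_squared_error(y, model.predict(X_masked))

def forward_significance(model, X, y, alpha=0.05, n_perm=200):
    """Greedily add features whose loss reduction is significant."""
    baseline = X.mean(axis=0)
    selected, candidates = [], list(range(X.shape[1]))
    current = masked_loss(model, X, y, selected, baseline)
    while candidates:
        # Score each candidate by the loss reduction it yields when added.
        gains = {j: current - masked_loss(model, X, y, selected + [j], baseline)
                 for j in candidates}
        best = max(gains, key=gains.get)
        # Permutation p-value: how often does a scrambled copy of the
        # best candidate achieve at least the observed gain?
        null_gains = []
        for _ in range(n_perm):
            X_perm = X.copy()
            X_perm[:, best] = rng.permutation(X_perm[:, best])
            null_gains.append(current - masked_loss(model, X_perm, y,
                                                    selected + [best], baseline))
        p_value = (1 + np.sum(np.array(null_gains) >= gains[best])) / (1 + n_perm)
        if p_value > alpha:
            break  # no remaining candidate is significant at level alpha
        selected.append(best)
        candidates.remove(best)
        current -= gains[best]
    return selected
```

In this sketch, the order in which features enter and the size of their loss reductions would give a rough, model-agnostic importance ranking; the paper's actual construction of the test and of feature-interaction terms may differ substantially.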