Nonparametric IPSS: Fast, flexible feature selection with false discovery control

Abstract

Feature selection is a critical task in machine learning and statistics. However, existing feature selection methods either (i) rely on parametric models such as linear or generalized linear models, (ii) lack theoretical false discovery control, or (iii) identify few true positives. Here, we introduce a general feature selection method with finite-sample false discovery control based on applying integrated path stability selection (IPSS) to arbitrary feature importance scores. The method is nonparametric whenever the importance scores are nonparametric, and it estimates q-values, which are better suited to high-dimensional data than p-values. We focus on two special cases using importance scores from gradient boosting (IPSSGB) and random forests (IPSSRF). Extensive nonlinear simulations with RNA sequencing data show that both methods accurately control the false discovery rate and detect more true positives than existing methods. Both methods are also efficient, running in under 20 seconds when there are 500 samples and 5000 features. We apply IPSSGB and IPSSRF to detect microRNAs and genes related to cancer, finding that they yield better predictions with fewer features than existing approaches.
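The core idea of combining stability selection with nonparametric importance scores can be illustrated with a minimal sketch: repeatedly fit a random forest on subsamples of the data, flag features whose importance exceeds a threshold, and keep features that are flagged in a large fraction of subsamples. This is an illustrative simplification only, not the authors' IPSS algorithm (IPSS integrates along the full regularization/threshold path and provides finite-sample false discovery control); the subsample count `B`, the importance cutoff, and the stability threshold below are arbitrary choices for the demo.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic data: 200 samples, 50 features, only the first 5 informative.
n, p = 200, 50
X = rng.normal(size=(n, p))
y = X[:, :5] @ np.array([3.0, -2.0, 2.0, -3.0, 2.5]) + rng.normal(size=n)

B = 20                 # number of half-sample subsamples (illustrative choice)
freq = np.zeros(p)     # how often each feature is flagged across subsamples
for b in range(B):
    idx = rng.choice(n, size=n // 2, replace=False)  # random half of the data
    rf = RandomForestRegressor(n_estimators=100, random_state=b)
    rf.fit(X[idx], y[idx])
    # Flag features whose importance exceeds the uniform baseline 1/p.
    freq += rf.feature_importances_ > 1.0 / p
freq /= B

# Keep features flagged in at least 60% of subsamples (illustrative threshold).
selected = np.flatnonzero(freq >= 0.6)
print(sorted(selected.tolist()))
```

In this toy setup the five informative features are typically selected and the noise features are not; the IPSS methods in the paper replace the ad hoc thresholds with an integrated selection criterion that yields q-value estimates and provable false discovery control.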

@article{melikechi2025_2410.02208,
  title={Nonparametric IPSS: Fast, flexible feature selection with false discovery control},
  author={Omar Melikechi and David B. Dunson and Jeffrey W. Miller},
  journal={arXiv preprint arXiv:2410.02208},
  year={2025}
}