Training a neural network for data reduction and better generalization

At a time of environmental concern about artificial intelligence, in particular its greedy appetite for storage and computation, sparsity-inducing neural networks offer a promising path towards frugality and less waste. Sparse learners compress the inputs (features) by selecting only the ones needed for good generalization. A human scientist can then give an intelligent interpretation to the few selected features. If genes are the inputs and cancer type is the output, then the selected genes give the oncologist clues about which genes affect certain cancers. LASSO-type regularization leads to good input selection for linear associations, but few attempts have been made for nonlinear associations modeled as an artificial neural network. A stringent but efficient way of testing whether a feature selection method works is to check whether a phase transition occurs in the probability of retrieving the relevant features, as observed and mathematically studied for linear models. Our method achieves exactly that for artificial neural networks and, on real data, it offers the best compromise between the number of selected features and generalization performance. Our method is flexible, applying to complex models ranging from shallow to deep artificial neural networks and supporting various cost functions and sparsity-promoting penalties. It does not rely on cross-validation or on a validation set to select its single regularization parameter, making it user-friendly. Our approach can be seen as a form of compressed sensing for complex models, allowing high-dimensional data to be distilled into a compact, interpretable subset of meaningful features, the very opposite of a black box. A Python package is available at this https URL, containing all the simulations and ready-to-use models.
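The abstract describes selecting input features of a neural network through sparsity-promoting penalties. As a rough illustration only, and not the authors' method or package, the sketch below applies a group-lasso penalty to the columns of a small network's first layer in PyTorch and then thresholds the column norms to read off the selected inputs; the synthetic data, architecture, penalty weight `lam`, and threshold are all illustrative assumptions.

```python
# Minimal, illustrative sketch (not the paper's package): input selection via a
# group-lasso penalty on the first-layer weight columns of a small network, so
# that irrelevant features have their entire incoming weight vector shrunk
# towards zero.  Data, architecture, `lam`, and the threshold are arbitrary.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression data: 100 features, only the first 5 are relevant.
n, p, s = 200, 100, 5
X = torch.randn(n, p)
y = torch.sin(X[:, :s]).sum(dim=1, keepdim=True) + 0.1 * torch.randn(n, 1)

model = nn.Sequential(nn.Linear(p, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 0.05  # single regularization parameter (here simply fixed by hand)

for epoch in range(2000):
    opt.zero_grad()
    mse = nn.functional.mse_loss(model(X), y)
    # Group-lasso penalty: l2 norm of each input's column of first-layer weights.
    W1 = model[0].weight                  # shape (32, p)
    penalty = W1.norm(dim=0).sum()        # sum over the p input groups
    (mse + lam * penalty).backward()
    opt.step()

# Features whose column norm survives a small threshold count as "selected".
selected = (model[0].weight.norm(dim=0) > 1e-2).nonzero().flatten()
print("selected features:", selected.tolist())
```

Note that plain gradient steps on a non-smooth penalty do not produce exact zeros, hence the final thresholding; in particular, this toy example does not reproduce the paper's contribution of setting the single regularization parameter without cross-validation or a validation set.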
@article{sardy2025_2411.17180,
  title   = {Training a neural network for data reduction and better generalization},
  author  = {Sylvain Sardy and Maxime van Cutsem and Xiaoyu Ma},
  journal = {arXiv preprint arXiv:2411.17180},
  year    = {2025}
}