We consider a sparse linear regression model in which the number of available predictors is much larger than the sample size and the number of non-zero coefficients is small. Classical model selection criteria cannot be used to choose the regression model in this setting. In recent years, special methods have been proposed to deal with this type of problem, for example modified versions of the Bayesian Information Criterion such as mBIC and mBIC2. These criteria have been shown to be consistent under the assumptions that both the sample size and the number of predictors tend to infinity and that the error term is normally distributed. In this article we prove the consistency of mBIC and mBIC2 under the weaker assumption that the error term is a sub-Gaussian random variable.
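For context, a commonly cited formulation of the mBIC penalty (following Bogdan et al.; the exact constants may differ from those used in this article) selects the model $M$ minimizing

\[
\mathrm{mBIC}(M) \;=\; n \log\!\left(\frac{\mathrm{RSS}_M}{n}\right) \;+\; |M|\,\log n \;+\; 2\,|M|\,\log\!\left(\frac{p}{c}\right),
\]

where $n$ is the sample size, $p$ the number of predictors, $|M|$ the number of selected predictors, $\mathrm{RSS}_M$ the residual sum of squares of model $M$, and $c$ a tuning constant (often $c = 4$). The mBIC2 variant additionally subtracts a term of order $2\log(|M|!)$, making the penalty less conservative for larger models.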