Pivotal Estimation of Nonparametric Functions via Square-root Lasso
We propose a rescaled square-root lasso method for estimating nonparametric regression with heteroskedastic noise. An attractive feature of the approach is that it does not rely on knowledge of the scale of the noise, unlike many other ℓ₁-regularized methods. In turn, this translates into a robustness property that allows the same penalty level to be appropriate for a variety of design conditions. Our analysis is based on new identification conditions that allow for repeated regressors. We derive various non-asymptotic performance bounds for the square-root lasso, including the prediction-norm rate, the ℓ₂-rate, the ℓ₁-rate, and a sharp sparsity bound. In order to cover heteroskedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (ols) applied to the model selected by the square-root lasso, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of ols post square-root lasso is no worse than that of the square-root lasso even with a misspecified selected model, and possibly better otherwise. We show that the robustness properties of the square-root lasso also extend to the parametric noiseless case and to the case of unbounded variance. In the first case, the square-root lasso recovers the true parameter value exactly, in sharp contrast to the lasso. In the second case, under symmetric disturbances, the square-root lasso can be applied with similar penalty choices and still achieves near-Gaussian rates in several cases, in contrast to the lasso, which would require a substantially larger penalty level.
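To make the estimator concrete, below is a minimal sketch of the square-root lasso objective ||y − Xβ||₂/√n + (λ/n)||β||₁ with the pivotal penalty λ = c·√n·Φ⁻¹(1 − α/(2p)), which does not involve the noise scale, together with an ols refit on the selected support. The helper names sqrt_lasso and ols_post_sqrt_lasso, the constants c = 1.1 and α = 0.05, the support-selection tolerance, and the use of cvxpy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import cvxpy as cp
from scipy.stats import norm


def sqrt_lasso(X, y, c=1.1, alpha=0.05):
    """Square-root lasso: minimize ||y - X b||_2 / sqrt(n) + (lam/n) * ||b||_1.

    The penalty lam = c * sqrt(n) * Phi^{-1}(1 - alpha/(2p)) is pivotal:
    it does not depend on the unknown noise scale sigma.
    (c and alpha here are illustrative defaults.)
    """
    n, p = X.shape
    lam = c * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))
    b = cp.Variable(p)
    objective = cp.norm(y - X @ b, 2) / np.sqrt(n) + (lam / n) * cp.norm(b, 1)
    cp.Problem(cp.Minimize(objective)).solve()  # second-order cone program
    return b.value


def ols_post_sqrt_lasso(X, y, beta_hat, tol=1e-6):
    """Refit ordinary least squares on the support selected by square-root lasso."""
    support = np.flatnonzero(np.abs(beta_hat) > tol)
    refit = np.zeros(X.shape[1])
    if support.size:
        refit[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
    return refit


# Usage on simulated heteroskedastic data: the same penalty is used even
# though the noise scale varies across observations.
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta0 = np.zeros(p)
beta0[:s] = 1.0
eps = rng.standard_normal(n) * (0.5 + np.abs(X[:, 0]))  # heteroskedastic noise
y = X @ beta0 + eps

b_sqrt = sqrt_lasso(X, y)
b_post = ols_post_sqrt_lasso(X, y, b_sqrt)
```

Note that the penalty depends only on n, p, and the chosen confidence level, which is the self-tuning property the abstract emphasizes; a plain lasso would instead need a penalty proportional to the unknown noise scale.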