*(English)* Zbl 1142.62044

Summary: *N. Meinshausen* and *P. Bühlmann* [Ann. Stat. 34, No. 3, 1436–1462 (2006; Zbl 1113.62082)] showed that for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition the LASSO is consistent even when the number of variables is of greater order than the sample size. *P. Zhao* and *B. Yu* [J. Machine Learning Res. 7, 2541–2567 (2006)] formalized the neighborhood stability condition in the context of linear regression as a strong irrepresentable condition. The latter paper showed that, under this condition, the LASSO selects exactly the set of nonzero regression coefficients, provided these coefficients are bounded away from zero at a certain rate.
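For context, the strong irrepresentable condition can be stated in its standard form (a common textbook formulation, not quoted from the paper under review). Write $S=\{j:\beta_j\ne 0\}$ for the true support and partition the scaled Gram matrix over $S$ and its complement:

```latex
% Strong irrepresentable condition (standard formulation).
% C = n^{-1} X^\top X, partitioned over S and S^c:
\[
  C = \begin{pmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{pmatrix},
  \qquad
  \bigl\| C_{21}\, C_{11}^{-1}\, \operatorname{sign}(\beta_S) \bigr\|_{\infty}
  \;\le\; 1 - \eta
\]
% for some constant \eta > 0. Under this condition, together with a rate
% condition on \min_{j \in S} |\beta_j|, the LASSO recovers S exactly.
```

Informally, the condition requires that the irrelevant variables not be too strongly correlated with the relevant ones.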

In this paper, the regression coefficients outside an ideal model are assumed to be small, but not necessarily zero. Under a sparse Riesz condition on the correlation of design variables, we prove that the LASSO selects a model of the correct order of dimensionality, controls the bias of the selected model at a level determined by the contributions of small regression coefficients and threshold bias, and selects all coefficients of greater order than the bias of the selected model. Moreover, as a consequence of this rate consistency of the LASSO in model selection, it is proved that the sum of error squares for the mean response and the $\ell_\alpha$-loss for the regression coefficients converge at the best possible rates under the given conditions. An interesting aspect of our results is that the logarithm of the number of variables can be of the same order as the sample size for certain random dependent designs.
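The p > n selection phenomenon summarized above can be illustrated numerically. The following is a minimal sketch, not the paper's procedure: it runs a plain coordinate-descent LASSO on an i.i.d. Gaussian design with more variables than observations, where the design, noise level, penalty scale, and coefficient values are all illustrative assumptions chosen so that the true coefficients are well above the threshold bias.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=500):
    """Coordinate descent for (1/(2n))||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                    # residual y - X @ beta
    col_sq = (X ** 2).sum(axis=0)   # per-coordinate curvature ||x_j||^2
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]  # remove coordinate j from the fit
            beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_sq[j]
            r -= X[:, j] * beta[j]  # restore the updated coordinate
    return beta

# Illustrative p > n setup: 200 variables, 100 observations, 5 true signals.
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 3.0                 # strong coefficients, well above the noise
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Penalty at the usual sigma * sqrt(2 log p / n) scale (noise level 0.5).
lam = 2 * 0.5 * np.sqrt(2 * np.log(p) / n)
beta_hat = lasso_cd(X, y, lam)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)
```

In this strong-signal configuration the selected set contains all five true variables while remaining of the same order of size, consistent with the rate-consistency result described in the summary; with small nonzero coefficients added outside the ideal model, the bias-controlled behavior rather than exact recovery is the relevant regime.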

##### MSC:

| MSC code | Classification |
| --- | --- |
| 62J05 | Linear regression |
| 62H12 | Multivariate estimation |
| 62G08 | Nonparametric regression |
| 62J07 | Ridge regression; shrinkage estimators |