van de Geer, Sara. Least squares estimation with complexity penalties. (English) Zbl 1005.62043
Math. Methods Stat. 10, No. 3, 355-374 (2001).

Summary: We examine the regression model \(Y_i=g_0(z_i)+W_i\), \(i=1,\dots,n\), and the penalized least squares estimator
\[ \widehat g_n=\arg\min_{g\in{\mathcal G}}\bigl\{\|Y-g\|^2+\text{pen}^2(g)\bigr\}, \]
where \(\text{pen}(g)\) is a penalty on the complexity of the function \(g\). We show that a rate of convergence for \(\widehat g_n\) is determined by the entropy of the sets
\[ {\mathcal G}_*(\delta)=\bigl\{g\in{\mathcal G}: \|g-g_*\|^2+\text{pen}^2(g)\leq\delta^2\bigr\},\quad \delta>0, \]
where \(g_*=\arg\min_{g\in{\mathcal G}}\{\|g-g_0\|^2+\text{pen}^2(g)\}\) (say). As examples, we consider Sobolev and dimension penalties.

Cited in 15 Documents

MSC: 62G08 Nonparametric regression and quantile regression

Keywords: entropy; model selection; penalized least squares
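To make the penalized criterion concrete, here is a minimal numpy sketch of a dimension penalty, one of the two examples the summary mentions: the estimator minimizes \(\|Y-g\|^2+\text{pen}^2(g)\) over nested polynomial models with \(\text{pen}^2(g)=\lambda\cdot\dim(g)\). All function names, the polynomial model class, and the choice \(\lambda=1\) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def penalized_lsq(z, y, max_dim=10, lam=1.0):
    """Penalized least squares over nested polynomial models.

    Minimizes ||y - g||^2 + lam * dim(g), i.e. a dimension penalty
    pen^2(g) = lam * dim(g). (Illustrative sketch, not the paper's setup.)
    """
    best = None
    for d in range(1, max_dim + 1):
        # Design matrix for polynomials of dimension d (degree d - 1).
        X = np.vander(z, d, increasing=True)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        fit = X @ coef
        crit = np.sum((y - fit) ** 2) + lam * d  # penalized criterion
        if best is None or crit < best[0]:
            best = (crit, d, coef)
    return best  # (criterion value, selected dimension, coefficients)

# Toy data: smooth signal plus noise, so the penalty trades fit vs. dimension.
rng = np.random.default_rng(0)
z = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * z) + 0.1 * rng.standard_normal(100)
crit, dim, coef = penalized_lsq(z, y)
```

The penalty discourages high-dimensional models: increasing `dim` always lowers the residual sum of squares, but the criterion only decreases while the fit improves by more than `lam` per added dimension.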