Least squares estimation with complexity penalties. (English) Zbl 1005.62043

Summary: We examine the regression model \(Y_i=g_0(z_i)+W_i\), \(i=1,\dots,n\), and the penalized least squares estimator \[ \widehat g_n=\arg \min_{g\in{\mathcal G}}\bigl \{\|Y-g\|^2+ \text{pen}^2(g)\bigr\}, \] where \(\text{pen} (g)\) is a penalty on the complexity of the function \(g\). We show that a rate of convergence for \(\widehat g_n\) is determined by the entropy of the sets \[ {\mathcal G}_*(\delta) =\bigl\{g\in{\mathcal G}: \|g-g_*\|^2+ \text{pen}^2 (g)\leq\delta^2 \bigr\},\;\delta>0, \] where \(g_*=\arg \min_{g\in {\mathcal G}}\{\|g-g_0 \|^2+\text{pen}^2(g)\}\) (say). As examples, we consider Sobolev and dimension penalties.
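A minimal numerical sketch (not from the paper itself): if the class \({\mathcal G}\) is linear, \(g(z)=\sum_j b_j\varphi_j(z)\), and the penalty is of ridge type, \(\text{pen}^2(g)=\lambda\|b\|^2\), then the penalized least squares criterion has a closed-form minimizer \(\widehat b=(\Phi^\top\Phi+\lambda I)^{-1}\Phi^\top Y\). The basis, sample size, and penalty weight below are illustrative choices, not quantities from the paper.

```python
import numpy as np

def penalized_ls(Phi, Y, lam):
    """Minimize ||Y - Phi b||^2 + lam * ||b||^2 over b (closed form)."""
    p = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(p), Phi.T @ Y)

rng = np.random.default_rng(0)
n = 200
z = np.linspace(0.0, 1.0, n)
g0 = np.sin(2 * np.pi * z)               # true regression function g_0
Y = g0 + 0.3 * rng.standard_normal(n)    # Y_i = g_0(z_i) + W_i

# Illustrative polynomial basis phi_j(z) = z^j, j = 0,...,9
Phi = np.vander(z, 10, increasing=True)

b_pen = penalized_ls(Phi, Y, lam=1.0)    # penalized estimator
b_ols = penalized_ls(Phi, Y, lam=0.0)    # unpenalized least squares

# The complexity penalty shrinks the fitted coefficients toward zero.
print(np.linalg.norm(b_pen) < np.linalg.norm(b_ols))
```

The same template covers a dimension penalty by instead minimizing over the basis size \(p\) with \(\text{pen}^2(g)\) proportional to \(p\).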

MSC:

62G08 Nonparametric regression and quantile regression