
A new approach to least-squares estimation, with applications. (English) Zbl 0625.62046

The regression model \(y=g(x)+\epsilon\) and least-squares estimation are studied in a general context. For an estimator of the unknown g to be statistically meaningful, it should at least be consistent in some sense. In the least-squares context, the most natural requirement is \(L^2\)-consistency.
In this paper the author shows that entropy conditions on a rescaled and truncated version of the class of regression functions \({\mathcal G}\) imply strong \(L^2\)-consistency of the least-squares estimator. A result from empirical process theory is used to prove this. He deals with a few examples, such as (non)linear regression and isotonic regression. Some nonparametric regression estimators can also be considered as least-squares estimators, or modifications thereof (for instance penalized least squares).
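For orientation, a standard formulation of the least-squares estimator over \({\mathcal G}\) and of the \(L^2\)-consistency notion discussed above is sketched below; the notation (\(\hat g_n\), \(g_0\), the empirical norm \(\|\cdot\|_n\)) is generic and not taken from the paper under review.
\[
\hat g_n \in \arg\min_{g\in{\mathcal G}} \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - g(x_i)\bigr)^2,
\qquad
\|\hat g_n - g_0\|_n^2 := \frac{1}{n}\sum_{i=1}^{n}\bigl(\hat g_n(x_i)-g_0(x_i)\bigr)^2 \;\longrightarrow\; 0 \quad\text{a.s.},
\]
where \(g_0\) denotes the true regression function generating \(y_i = g_0(x_i)+\epsilon_i\).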
Reviewer: U.B.Paik

MSC:

62J02 General nonlinear regression
62J05 Linear regression; mixed models
62G05 Nonparametric estimation
60B10 Convergence of probability measures
Full Text: DOI