
Multivariate locally weighted least squares regression. (English) Zbl 0821.62020
Summary: Nonparametric regression using locally weighted least squares was first discussed by Stone and by Cleveland. Recently, it was shown by J. Fan [ibid. 21, No. 1, 196-216 (1993; Zbl 0773.62029)] and by J. Fan and I. Gijbels [ibid. 20, No. 4, 2008-2036 (1992; Zbl 0765.62040)] that the local linear kernel-weighted least squares regression estimator has asymptotic properties making it superior, in certain senses, to the Nadaraya-Watson and Gasser-Müller kernel estimators.
In this paper we extend their results on asymptotic bias and variance to the case of multivariate predictor variables. We are able to derive the leading bias and variance terms for general multivariate kernel weights using weighted least squares matrix theory. This approach is especially convenient when analyzing the asymptotic conditional bias and variance of the estimator at points near the boundary of the support of the predictors. We also investigate the asymptotic properties of the multivariate local quadratic least squares regression estimator discussed by W. S. Cleveland and S. J. Devlin [J. Am. Stat. Assoc. 83, 596-610 (1988)] and, in the univariate case, higher-order polynomial fits and derivative estimation.
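The estimator summarized above fits, at each evaluation point, a weighted linear model with kernel weights centered at that point; the fitted intercept is the regression estimate. A minimal sketch of this idea (illustrative only, with a product Gaussian kernel and a single scalar bandwidth; function and variable names are not from the paper):

```python
import numpy as np

def local_linear_fit(X, y, x0, bandwidth):
    """Multivariate local linear estimate of m(x0) via weighted least squares.

    X : (n, d) predictor matrix; y : (n,) responses;
    x0 : (d,) evaluation point; bandwidth : scalar kernel bandwidth.
    """
    diffs = X - x0  # predictors centered at the evaluation point
    # Product Gaussian kernel weights K_h(X_i - x0)
    w = np.exp(-0.5 * np.sum((diffs / bandwidth) ** 2, axis=1))
    # Local linear design: intercept column plus centered predictors
    Z = np.hstack([np.ones((X.shape[0], 1)), diffs])
    W = np.diag(w)
    # Weighted least squares normal equations
    beta = np.linalg.solve(Z.T @ W @ Z, Z.T @ W @ y)
    return beta[0]  # fitted intercept = estimate of m(x0)

# Example: recover a smooth bivariate regression function from noisy data
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=500)
est = local_linear_fit(X, y, np.array([0.0, 0.0]), bandwidth=0.3)
```

Setting the gradient terms in `beta` aside and reading off only the intercept is what gives the local linear estimator its favorable boundary-bias behavior relative to the Nadaraya-Watson estimator.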

62G07 Density estimation
62H12 Estimation in multivariate analysis
62J02 General nonlinear regression