## Dimension reduction in a semiparametric regression model with errors in covariates.
*(English)*
Zbl 0822.62022

Summary: We consider a semiparametric estimation method for general regression models when some of the predictors are measured with error. The technique relies on a kernel regression of the “true” covariate on all the observed covariates and surrogates. This requires a nonparametric regression in as many dimensions as there are covariates and surrogates. The usual theory copes with such higher-dimensional problems by using higher-order kernels, but this is unrealistic for most problems. We show that the usual theory is essentially as good as one can do with this technique.
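The kernel-regression step described above can be illustrated with a Nadaraya-Watson estimator using a product Gaussian (second-order) kernel. This is a minimal sketch, not the paper's actual estimator: the data, bandwidth `h`, and the two-surrogate setup below are illustrative assumptions.

```python
import numpy as np

def nadaraya_watson(x0, X, Y, h):
    """Multivariate Nadaraya-Watson estimate of E[Y | X = x0].

    X : (n, d) observed covariates and surrogates
    Y : (n,)   responses (here: the "true" covariate)
    h : scalar bandwidth for a product Gaussian kernel
    """
    u = (x0 - X) / h                          # (n, d) scaled differences
    w = np.exp(-0.5 * np.sum(u**2, axis=1))   # product Gaussian kernel weights
    return np.sum(w * Y) / np.sum(w)

# Illustrative data: regress a noisy "true" covariate on two surrogates.
rng = np.random.default_rng(0)
W = rng.normal(size=(500, 2))                 # observed covariates/surrogates
Xtrue = W @ np.array([1.0, 0.5]) + 0.1 * rng.normal(size=500)

# Estimate E[Xtrue | W = (0, 0)]; the truth is 0 under this design.
est = nadaraya_watson(np.array([0.0, 0.0]), W, Xtrue, h=0.4)
```

With a second-order kernel, the pointwise bias is of order \(h^2\) regardless of the dimension \(d\), while the variance grows like \((nh^d)^{-1}\); this is the curse of dimensionality that higher-order kernels are meant to (unrealistically) offset.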

Instead of regression with higher-order kernels, we propose the use of dimension reduction techniques. We assume that the “true” covariate depends only on a linear combination of the observed covariates and surrogates. If this linear combination were known, we could apply the one-dimensional versions of the semiparametric problem, for which standard kernels are applicable. We show that if one can estimate the linear directions at the root-\(n\) rate, then asymptotically the resulting estimator of the parameters in the main regression model behaves as if the linear combination were known. Simulations lend some credence to the asymptotic results.
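The two-stage idea can be sketched under a hypothetical single-index design. Here ordinary least squares serves as a stand-in root-\(n\) direction estimator (it recovers the direction up to scale for this Gaussian design by Stein's lemma); the paper's actual direction estimator is not reproduced here, and the link function and bandwidth are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 1000, 3
W = rng.normal(size=(n, d))                   # observed covariates and surrogates
beta = np.array([2.0, -1.0, 0.5])
beta /= np.linalg.norm(beta)                  # true (unit) direction
index = W @ beta                              # the single linear combination
Xtrue = np.sin(index) + 0.1 * rng.normal(size=n)   # "true" covariate

# Stage 1: root-n estimate of the direction. For Gaussian W, the OLS
# coefficient vector is proportional to beta (Stein's lemma), so its
# normalization is a simple stand-in for a root-n direction estimator.
b_ols, *_ = np.linalg.lstsq(W, Xtrue, rcond=None)
b_hat = b_ols / np.linalg.norm(b_ols)

# Stage 2: one-dimensional kernel regression on the estimated index,
# where a standard second-order kernel suffices.
t = W @ b_hat

def kreg1d(t0, t, y, h):
    """Nadaraya-Watson estimate of E[y | index = t0] with Gaussian kernel."""
    w = np.exp(-0.5 * ((t0 - t) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

fit0 = kreg1d(0.0, t, Xtrue, h=0.2)           # truth is sin(0) = 0
```

Because the direction enters the second stage only through the smooth index \(W^\top \hat\beta\), a root-\(n\) error in \(\hat\beta\) is asymptotically negligible next to the slower nonparametric rate, which is the sense in which the estimator behaves as if the linear combination were known.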
