A mathematical model of projection pursuit regression based on kernel estimation (of the marginal density of errors in the projection direction) is proposed, and the necessary mathematical calculus for the projection approximation of the target function is built up. The main results are explicit formulae for the bias and the error about the mean in orientation estimates and curve estimates. They show that the estimate of the orientation (of the projection) has most of its error in the form of bias.
They also prove that the kernel-based projection pursuit regression does estimate the corresponding projections with convergence rates identical to those known from one-dimensional estimation, namely bias of order $h^2$ and error about the mean of order $(nh)^{-1/2}$ ($h$ being the bandwidth of the kernel estimator). The estimator $\hat G(x\mid\theta)$ of $G(x)$, based on projections onto a direction (say $\theta$), is required to minimize
$$\sum_{i=1}^n K\{(\theta^T X_i - \theta^T x)/h\}\,\{Y_i - \hat G(x\mid\theta)\}^2$$
(where $(X_i, Y_i)$, $i = 1, \dots, n$, are the data). The main idea of how to construct $\hat G$ is to do it through estimating $\theta$ by the $\hat\theta$ which minimizes
$$S(\theta) = \sum_{i=1}^n \{Y_i - \hat G_{-i}(\theta^T X_i \mid \theta)\}^2,$$
where $\hat G_{-i}$ is the nonparametric (kernel) estimate of $G(x)$ based on all points except $(X_i, Y_i)$. At the end, some alternative approaches (with random window, two-stage algorithm, etc.) are discussed.
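The cross-validated orientation estimate described above can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' procedure: a Gaussian kernel, the Nadaraya-Watson form of the estimate, two-dimensional covariates with a grid search over planar directions, and all function and variable names are assumptions introduced here.

```python
# Sketch of kernel-based projection pursuit regression:
# orientation chosen by minimizing a leave-one-out CV sum of squares.
import numpy as np

def nw_estimate(t0, t, y, h, exclude=None):
    """Nadaraya-Watson kernel estimate of E[Y | theta'X = t0].

    t: projected covariates theta'X_i; y: responses; h: bandwidth.
    exclude: index of the point left out (leave-one-out), or None.
    """
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)  # Gaussian kernel weights
    if exclude is not None:
        w[exclude] = 0.0
    s = w.sum()
    return w @ y / s if s > 0 else y.mean()

def cv_score(theta, X, y, h):
    """Cross-validation criterion S(theta) = sum_i {y_i - G_{-i}(theta'X_i)}^2."""
    t = X @ theta
    return sum((y[i] - nw_estimate(t[i], t, y, h, exclude=i)) ** 2
               for i in range(len(y)))

def estimate_orientation(X, y, h, n_grid=180):
    """Grid search over unit vectors in the plane (2-D covariates only)."""
    angles = np.linspace(0.0, np.pi, n_grid, endpoint=False)
    thetas = np.column_stack([np.cos(angles), np.sin(angles)])
    scores = [cv_score(th, X, y, h) for th in thetas]
    return thetas[int(np.argmin(scores))]

# Synthetic single-index data: Y = sin(2 * theta'X) + noise.
rng = np.random.default_rng(0)
theta_true = np.array([np.cos(0.3), np.sin(0.3)])
X = rng.normal(size=(200, 2))
y = np.sin(2 * X @ theta_true) + 0.1 * rng.normal(size=200)

theta_hat = estimate_orientation(X, y, h=0.3)
# theta is identified only up to sign, so compare |theta_hat . theta_true|.
print(abs(theta_hat @ theta_true))
```

The grid search keeps the sketch transparent; in higher dimensions one would replace it with a numerical optimizer over the unit sphere, and the bandwidth $h$ would itself be chosen from the data.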