A distribution-free theory of nonparametric regression.

*(English)* Zbl 1021.62024
Springer Series in Statistics. New York, NY: Springer. xvi, 647 p. (2002).

This book presents a modern approach to nonparametric regression estimation with random design. It covers almost all known estimates, such as classical local averaging estimates (including kernel, partitioning, and nearest neighbor estimates), least squares estimates using splines, neural networks and radial basis function networks, penalized least squares estimates, local polynomial kernel estimates, and orthogonal series estimates. The main topic of the investigation is to prove universal consistency, that is, to show that the estimates are consistent (with respect to the \(L_2\) error) for all distributions of the underlying data. Since it is impossible to derive such distribution-free results for the rate of convergence, smoothness classes of distributions are introduced and optimal minimax rates of convergence within these classes are derived. Furthermore, adaptive procedures achieving these rates are described.
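To illustrate the simplest member of the local averaging family discussed above, here is a minimal sketch of a kernel (Nadaraya-Watson) regression estimate with a naive window kernel; the function name, bandwidth choice, and test data are illustrative assumptions, not taken from the book:

```python
import numpy as np

def kernel_estimate(x, X, Y, h):
    """Local averaging estimate m_n(x) with a naive (window) kernel:
    average the responses Y_i whose covariates X_i lie within bandwidth h
    of the query point x; return 0 if the window is empty (a common
    convention in the universal consistency literature)."""
    weights = (np.abs(X - x) <= h).astype(float)
    total = weights.sum()
    return float(weights @ Y / total) if total > 0 else 0.0

# Noisy samples from the regression function m(x) = x^2 on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 500)
Y = X**2 + rng.normal(0.0, 0.1, 500)

estimate = kernel_estimate(0.5, X, Y, h=0.05)  # close to m(0.5) = 0.25
```

Consistency results of the kind proved in the book assert that, as the sample size grows and the bandwidth h shrinks at a suitable rate, the \(L_2\) error of such an estimate tends to zero for every distribution of the data.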

This book is a self-contained text intended for a wide audience, including graduate students in statistics, mathematics and computer science as well as researchers. Because of the clear mathematical presentation it can also be used for a course on nonparametric regression estimation. Starting off with elementary techniques in the first chapters, the authors develop more difficult concepts, including empirical process theory, martingales and approximation properties of neural networks. This makes the book a valuable reference for anyone interested in nonparametric regression, as well as a source of many useful mathematical techniques.

Reviewer: H. Liero (Potsdam)

##### MSC:

- 62G08: Nonparametric regression and quantile regression
- 62G20: Asymptotic properties of nonparametric inference
- 62-01: Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics
- 62-02: Research exposition (monographs, survey articles) pertaining to statistics