## CART and best-ortho-basis: a connection.
*(English)*
Zbl 0942.62044

Summary: We study what we call “dyadic CART” – a method of nonparametric regression which constructs a recursive partition by optimizing a complexity-penalized sum of squares, where the optimization is over all recursive partitions arising from midpoint splits. We show that the method is adaptive to unknown degrees of anisotropic smoothness. Specifically, consider the anisotropic smoothness classes of S.M. Nikol’skij [Approximation of functions of several variables and imbedding theorems. (1975; Zbl 0307.46024)] consisting of bivariate functions \(f(x_1,x_2)\) whose finite differences at distance \(h\) in direction \(i\) are bounded in \(L^p\) norm by \(Ch^{\delta_i}\), \(i=1,2\). We show that dyadic CART, with an appropriate complexity penalty parameter \(\lambda\sim\sigma^2 \cdot\text{Const} \cdot \log(n)\), is within logarithmic terms of minimax over every anisotropic smoothness class with \(0<C<\infty\), \(0<\delta_1,\delta_2\leq 1\).
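The optimization over all recursive midpoint-split partitions admits an exact dynamic program: the optimal cost of a dyadic rectangle is the minimum of fitting a single constant (plus one penalty unit \(\lambda\)) and of the two possible midpoint splits. The following is an illustrative NumPy sketch of this idea, not code from the paper; the function name `dyadic_cart` and the recursive memoized formulation are my own choices.

```python
import numpy as np
from functools import lru_cache

def dyadic_cart(y, lam):
    """Complexity-penalized least squares over recursive midpoint splits.

    Minimizes  sum of squared residuals + lam * (number of leaves)
    over all recursive dyadic partitions of the grid, fitting the mean
    on each leaf.  y is a 2-D array whose side lengths are powers of 2.
    """
    n1, n2 = y.shape
    # 2-D cumulative sums give O(1) rectangle sums of y and y^2
    S = np.zeros((n1 + 1, n2 + 1)); S[1:, 1:] = y.cumsum(0).cumsum(1)
    S2 = np.zeros((n1 + 1, n2 + 1)); S2[1:, 1:] = (y ** 2).cumsum(0).cumsum(1)

    def rect_sums(r0, r1, c0, c1):
        s = S[r1, c1] - S[r0, c1] - S[r1, c0] + S[r0, c0]
        s2 = S2[r1, c1] - S2[r0, c1] - S2[r1, c0] + S2[r0, c0]
        return s, s2

    @lru_cache(maxsize=None)
    def cost(r0, r1, c0, c1):
        s, s2 = rect_sums(r0, r1, c0, c1)
        m = (r1 - r0) * (c1 - c0)
        leaf = s2 - s * s / m + lam          # SSE of constant fit + penalty
        best, split = leaf, None
        if r1 - r0 > 1:                      # horizontal midpoint split
            mid = (r0 + r1) // 2
            c = cost(r0, mid, c0, c1)[0] + cost(mid, r1, c0, c1)[0]
            if c < best: best, split = c, ('h', mid)
        if c1 - c0 > 1:                      # vertical midpoint split
            mid = (c0 + c1) // 2
            c = cost(r0, r1, c0, mid)[0] + cost(r0, r1, mid, c1)[0]
            if c < best: best, split = c, ('v', mid)
        return best, split

    # read the optimal partition back off the memo table
    fit = np.empty_like(y, dtype=float)
    def fill(r0, r1, c0, c1):
        _, split = cost(r0, r1, c0, c1)
        if split is None:
            s, _ = rect_sums(r0, r1, c0, c1)
            fit[r0:r1, c0:c1] = s / ((r1 - r0) * (c1 - c0))
        elif split[0] == 'h':
            fill(r0, split[1], c0, c1); fill(split[1], r1, c0, c1)
        else:
            fill(r0, r1, c0, split[1]); fill(r0, r1, split[1], c1)
    fill(0, n1, 0, n2)
    return fit
```

On noiseless data with a piecewise-constant structure aligned to a dyadic split, a small \(\lambda\) recovers the signal exactly; the summary's choice \(\lambda\sim\sigma^2\cdot\text{Const}\cdot\log(n)\) calibrates the penalty to the noise level.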

The proof shows that dyadic CART is identical to a certain adaptive best-ortho-basis algorithm based on the library of all anisotropic Haar bases. Then it applies empirical basis selection ideas of D.L. Donoho and I.M. Johnstone [C. R. Acad. Sci., Paris, Ser. I 319, No. 12, 1317-1322 (1994; Zbl 0819.94007)]. The basis empirically selected by dyadic CART is shown to be nearly as good as a basis ideally adapted to the underlying \(f\). The risk of estimation in an ideally adapted anisotropic Haar basis is shown to be comparable to the minimax risk over anisotropic smoothness classes.

Underlying the success of this argument is harmonic analysis of anisotropic smoothness classes. We show that, for each anisotropic smoothness class, there is an anisotropic Haar basis which is a best orthogonal basis for representing that smoothness class; the basis is optimal not just within the library of anisotropic Haar bases, but among all orthogonal bases of \(L^2[0, 1]^2\).
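An anisotropic Haar basis arises from a recursive dyadic partition in which the split direction may differ from node to node: the basis consists of the constant function together with one two-valued, zero-mean step function per split. The sketch below is my own illustration of this construction (the helper `anisotropic_haar_basis` and the `choose_dir` callback are assumptions, not the paper's notation); varying `choose_dir` over the tree sweeps out the library of bases.

```python
import numpy as np

def anisotropic_haar_basis(n1, n2, choose_dir):
    """Orthonormal anisotropic Haar basis on an n1 x n2 grid.

    choose_dir(r0, r1, c0, c1) -> 'h' or 'v' picks the midpoint split
    direction at each node of the recursive dyadic partition.
    Returns a (n1*n2, n1*n2) matrix whose rows are the basis vectors.
    """
    vecs = [np.full((n1, n2), 1.0 / np.sqrt(n1 * n2))]  # constant function

    def recurse(r0, r1, c0, c1):
        h, w = r1 - r0, c1 - c0
        if h * w == 1:
            return
        d = choose_dir(r0, r1, c0, c1)
        if d == 'h' and h > 1:
            mid = (r0 + r1) // 2
            a, b = (r0, mid, c0, c1), (mid, r1, c0, c1)
        elif w > 1:
            mid = (c0 + c1) // 2
            a, b = (r0, r1, c0, mid), (r0, r1, mid, c1)
        else:                                # only a horizontal split fits
            mid = (r0 + r1) // 2
            a, b = (r0, mid, c0, c1), (mid, r1, c0, c1)
        na = (a[1] - a[0]) * (a[3] - a[2])
        nb = (b[1] - b[0]) * (b[3] - b[2])
        # Haar step: positive on child a, negative on child b,
        # scaled to have zero mean and unit norm
        psi = np.zeros((n1, n2))
        psi[a[0]:a[1], a[2]:a[3]] = np.sqrt(nb / (na * (na + nb)))
        psi[b[0]:b[1], b[2]:b[3]] = -np.sqrt(na / (nb * (na + nb)))
        vecs.append(psi)
        recurse(*a); recurse(*b)

    recurse(0, n1, 0, n2)
    return np.stack([v.ravel() for v in vecs])
```

Each step function is orthogonal to the constant (zero mean) and to steps at other nodes (disjoint or nested supports), so the rows form an orthonormal basis regardless of which split directions `choose_dir` returns.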


### MSC:

| Code | Classification |
| --- | --- |
| 62G08 | Nonparametric regression and quantile regression |
| 41A30 | Approximation by other special function classes |
| 42C40 | Nontrigonometric harmonic analysis involving wavelets and other special systems |
| 62G20 | Asymptotic properties of nonparametric inference |
| 41A25 | Rate of convergence, degree of approximation |