
Estimating a smooth monotone regression function. (English) Zbl 0737.62038

The problem of estimating a smooth monotone regression function \(m\) is studied. Two estimators, \(m_{SI}\) and \(m_{IS}\), are compared. The estimator \(m_{SI}\) is built in two steps: (i) smoothing of the data by a kernel estimator, (ii) isotonisation of the smoothed values by the pool adjacent violators algorithm. The estimator \(m_{IS}\) is constructed by interchanging these two steps.
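To make the two constructions concrete, here is a minimal sketch in Python (not taken from the paper; the Gaussian kernel, the grid handling, and the names kernel_smooth, pava, m_SI, m_IS are illustrative assumptions):

import numpy as np

def kernel_smooth(x_grid, x, y, h):
    # Nadaraya-Watson kernel smoother; the Gaussian kernel is an illustrative choice
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def pava(y):
    # Pool adjacent violators algorithm: least-squares projection of the
    # sequence y onto the cone of nondecreasing sequences
    vals, wts = [], []  # block means and block sizes
    for v in y:
        vals.append(float(v)); wts.append(1)
        while len(vals) > 1 and vals[-2] > vals[-1]:  # merge violating blocks
            total = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / total
            wts[-2] = total
            vals.pop(); wts.pop()
    return np.repeat(vals, wts)

def m_SI(x_grid, x, y, h):
    # (i) smooth, (ii) isotonise the smoothed values on the (sorted) grid
    return pava(kernel_smooth(x_grid, x, y, h))

def m_IS(x_grid, x, y, h):
    # (i) isotonise the raw data (sorted by x), (ii) smooth the result
    return kernel_smooth(x_grid, x, pava(y), h)

Both versions assume the design points x and the evaluation grid are sorted in increasing order.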
The author considers the asymptotic behaviour of these estimators at a fixed point \(x_0\) at which the function \(m\) is assumed to be strictly monotone and smooth. It is shown that if the bandwidth of the kernel estimator is chosen of the optimal order \(n^{-1/5}\), then the estimation errors of \(m_{SI}(x_0)\) and \(m_{IS}(x_0)\) are of order \(n^{-2/5}\) and the two estimators are asymptotically equivalent to first order. The difference \(m_{SI}(x_0)-m_{IS}(x_0)\), however, is of the only slightly smaller order \(n^{-8/15}\).
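For orientation, the rate can be read off from the standard kernel smoothing trade-off (a textbook calculation, not spelled out in the review; it presumes the paper's smoothness conditions): the squared bias of the kernel estimator at \(x_0\) is of order \(h^4\) and its variance of order \((nh)^{-1}\), so \[ \operatorname{MSE}(h)\asymp h^4+(nh)^{-1},\qquad h_{\text{opt}}\asymp n^{-1/5},\qquad \operatorname{MSE}(h_{\text{opt}})\asymp n^{-4/5}, \] which corresponds to an estimation error of order \(n^{-2/5}\).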
Theorem 3 of the paper provides stochastic higher-order expansions for \(m_{SI}(x_0)\) and \(m_{IS}(x_0)\). These expansions entail that \(m_{IS}(x_0)\) always has a smaller variance and a larger bias than \(m_{SI}(x_0)\). Furthermore, it is shown that the kernel function \(K\) of the chosen kernel estimator essentially determines whether one should prefer the estimator \(m_{SI}\) or \(m_{IS}\). If the bandwidth of the kernel estimator \(m_S\) is chosen such that the mean squared error is asymptotically minimized, then \(m_{IS}(x_0)\) has asymptotically smaller mean squared error than \(m_{SI}(x_0)\) if and only if \[ \frac{\int K^2(t)\,dt}{\int t^2K(t)\,dt\,\int K'(t)^2\,dt} \] is smaller than a universal constant.
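The kernel functional on the left-hand side is straightforward to evaluate numerically. A small sketch (again illustrative, using a midpoint rule on the kernel's support; the value of the universal constant itself is not stated in the review and is not reproduced here):

import numpy as np

def kernel_functional(K, dK, a=-1.0, b=1.0, n=100000):
    # midpoint-rule approximation of
    #   int K^2 dt / ( int t^2 K(t) dt * int K'(t)^2 dt )  on [a, b]
    t = np.linspace(a, b, n + 1)
    tm = 0.5 * (t[:-1] + t[1:])
    dt = (b - a) / n
    int_K2  = np.sum(K(tm) ** 2) * dt
    int_t2K = np.sum(tm ** 2 * K(tm)) * dt
    int_dK2 = np.sum(dK(tm) ** 2) * dt
    return int_K2 / (int_t2K * int_dK2)

# Epanechnikov kernel K(t) = (3/4)(1 - t^2) on [-1, 1]: the functional equals 2
print(kernel_functional(lambda t: 0.75 * (1 - t ** 2),
                        lambda t: -1.5 * t))
# Quartic kernel K(t) = (15/16)(1 - t^2)^2 on [-1, 1]: the functional equals 7/3
print(kernel_functional(lambda t: (15 / 16) * (1 - t ** 2) ** 2,
                        lambda t: -(15 / 4) * t * (1 - t ** 2)))

The exact values 2 and 7/3 (obtained by direct integration) confirm that the criterion genuinely varies with the kernel.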
For related literature see K. Cheng and P. Lin [Z. Wahrscheinlichkeitstheor. Verw. Geb. 57, 223–233 (1981; Zbl 0443.62029)] and R. E. Barlow, D. J. Bartholomew, J. M. Bremner and H. D. Brunk [Statistical inference under order restrictions. The theory and application of isotonic regression (1972; Zbl 0246.62038)].

MSC:

62G07 Density estimation