Park, B. U.; Lee, Y. K.; Ha, S. \(L_{2}\) boosting in kernel regression. (English) Zbl 1200.62040
Bernoulli 15, No. 3, 599-613 (2009).

Summary: We investigate the theoretical and empirical properties of \(L_{2}\) boosting with kernel regression estimates as weak learners. We show that each step of \(L_{2}\) boosting reduces the bias of the estimate by two orders of magnitude, while it does not deteriorate the order of the variance. We illustrate the theoretical findings with simulated examples. We also demonstrate that \(L_{2}\) boosting is superior to the use of higher-order kernels, a well-known method for reducing the bias of a kernel estimate.

Cited in 3 Documents

MSC:
62G08 Nonparametric regression and quantile regression
62G20 Asymptotic properties of nonparametric inference
65C60 Computational problems in statistics (MSC2010)

Keywords: bias reduction; boosting; kernel regression; Nadaraya-Watson smoother; twicing
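The procedure the summary describes can be sketched in a few lines: \(L_{2}\) boosting starts from the zero fit and repeatedly applies a weak learner (here a Nadaraya-Watson smoother) to the current residuals, adding each correction to the running fit; one boosting step recovers Tukey's "twicing". This is a minimal illustration, not the authors' code; the Gaussian kernel, the bandwidth, and the test function are arbitrary choices for the example.

```python
import numpy as np

def nw_smoother(x_train, y, x_eval, h):
    """Nadaraya-Watson estimate at x_eval with a Gaussian kernel, bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def l2_boost(x, y, h, n_steps):
    """L2 boosting: start from the zero fit, repeatedly smooth the current
    residuals with the weak learner, and add the correction to the fit."""
    fit = np.zeros_like(y, dtype=float)
    for _ in range(n_steps):
        fit = fit + nw_smoother(x, y - fit, x, h)  # weak learner on residuals
    return fit

# Noiseless example: the remaining error after each step is pure bias,
# so it makes the bias reduction of boosting directly visible.
x = np.linspace(0.0, 1.0, 201)
m = np.sin(2 * np.pi * x)          # smooth "true" regression function
interior = slice(30, 171)          # x in [0.15, 0.85]: avoid boundary effects

bias1 = np.abs(l2_boost(x, m, h=0.05, n_steps=1) - m)[interior].max()
bias2 = np.abs(l2_boost(x, m, h=0.05, n_steps=2) - m)[interior].max()
# bias2 (one boosting step, i.e. twicing) is much smaller than bias1.
```

On a grid the smoother is a matrix \(S\), so two steps produce the fit \((2S - S^{2})y\), which is exactly the twicing estimate; each further step multiplies the bias operator by another factor of \((I - S)\), matching the paper's claim that every step improves the bias order without worsening the variance order.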