zbMATH — the first resource for mathematics

Iterative bias reduction: a comparative study. (English) Zbl 1322.62131
Summary: Multivariate nonparametric smoothers, such as kernel-based smoothers and thin-plate spline smoothers, are adversely affected by the sparseness of data in high dimensions, a phenomenon known as the curse of dimensionality. Adaptive smoothers, which can exploit the underlying smoothness of the regression function, may partially mitigate this effect. This paper presents a comparative simulation study of a novel adaptive smoother (IBR) against competing multivariate smoothers available as packages or functions within the R language and environment for statistical computing. Comparisons between the methods are made on simulated datasets of moderate size, from 50 to 200 observations, with two, five, or ten potential explanatory variables, and on a real dataset. The results show that the good asymptotic properties of IBR are complemented by very good behavior on moderate-sized datasets, with results similar to those obtained with Duchon low-rank splines.
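The iterative bias reduction idea behind IBR can be sketched as follows: starting from a pilot linear smoother S, the bias of the current fit is estimated by smoothing the residuals and added back, giving the recursion f_{k+1} = f_k + S(y - f_k). The sketch below is a minimal illustrative implementation in Python, not the authors' R package ibr; the Gaussian-kernel pilot smoother, the bandwidth, and the fixed iteration count are assumptions made for illustration (in IBR the number of iterations is the smoothing parameter to be selected).

```python
import numpy as np

def kernel_smoother_matrix(x, bandwidth):
    # Nadaraya-Watson smoother matrix with a Gaussian kernel (pilot smoother S).
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)
    return w / w.sum(axis=1, keepdims=True)

def iterative_bias_reduction(x, y, bandwidth=1.0, n_iter=10):
    # Start from the pilot fit, then repeatedly estimate the bias by
    # smoothing the residuals and add it back: f_{k+1} = f_k + S (y - f_k).
    S = kernel_smoother_matrix(x, bandwidth)
    fit = S @ y
    for _ in range(n_iter):
        fit = fit + S @ (y - fit)
    return fit

# Toy one-dimensional example (50 observations, as in the smallest
# simulated datasets considered in the paper).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
fit = iterative_bias_reduction(x, y, bandwidth=0.1, n_iter=20)
```

Each iteration reduces the bias of an oversmoothed pilot fit at the price of added variance, which is why stopping the iteration at the right step acts as smoothing-parameter selection.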

62G08 Nonparametric regression and quantile regression
ibr; gss; R
Full Text: DOI
[1] Akaike, H.: Information theory and an extension of the maximum likelihood principle. In: Petrov, B.N., Csaki, F. (eds.) Second International Symposium on Information Theory, pp. 267–281. Akadémiai Kiadó, Budapest (1973) · Zbl 0283.62006
[2] Breiman, L.: Bagging predictors. Mach. Learn. 24, 123–140 (1996) · Zbl 0858.68080
[3] Breiman, L.: Using adaptive bagging to debias regressions. Tech. Rep. 547, Department of Statistics, UC Berkeley (1999)
[4] Breiman, L., Friedman, J., Olshen, R., Stone, C.: Classification and Regression Trees, 4th edn. CRC Press, Boca Raton (1984) · Zbl 0541.62042
[5] Bühlmann, P., Yu, B.: Boosting with the L2 loss: regression and classification. J. Am. Stat. Assoc. 98, 324–339 (2003) · Zbl 1041.62029
[6] Bühlmann, P., Yu, B.: Sparse boosting. J. Mach. Learn. Res. 7, 1001–1024 (2006) · Zbl 1222.68155
[7] Buja, A., Hastie, T., Tibshirani, R.: Linear smoothers and additive models. Ann. Stat. 17, 453–510 (1989) · Zbl 0689.62029
[8] Cornillon, P.A., Hengartner, N., Matzner-Løber, E.: Iterative bias reduction multivariate smoothing in R: the IBR package (2011a). arXiv:1105.3605v1
[9] Cornillon, P.A., Hengartner, N., Matzner-Løber, E.: Recursive bias estimation for multivariate regression (2011b). arXiv:1105.3430v2 · Zbl 1305.62162
[10] Craven, P., Wahba, G.: Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numer. Math. 31, 377–403 (1979) · Zbl 0377.65007
[11] Di Marzio, M., Taylor, C.: On boosting kernel regression. J. Stat. Plan. Inference 138, 2483–2498 (2008) · Zbl 1182.62091
[12] Duchon, J.: Splines minimizing rotation-invariant semi-norms in Sobolev spaces. In: Schempp, W., Zeller, K. (eds.) Constructive Theory of Functions of Several Variables, pp. 85–100. Springer, Berlin (1977)
[13] Eubank, R.: Spline Smoothing and Nonparametric Regression. Marcel Dekker, New York (1988) · Zbl 0702.62036
[14] Fan, J., Gijbels, I.: Local Polynomial Modelling and Its Applications. Chapman & Hall, New York (1996) · Zbl 0873.62037
[15] Friedman, J.: Multivariate adaptive regression splines. Ann. Stat. 19, 1–67 (1991) · Zbl 0765.62064
[16] Friedman, J.: Greedy function approximation: A gradient boosting machine. Ann. Stat. 28, 1189–1232 (2001) · Zbl 1043.62034
[17] Friedman, J., Stuetzle, W.: Projection pursuit regression. J. Am. Stat. Assoc. 76, 817–823 (1981)
[18] Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting. Ann. Stat. 28, 337–407 (2000) · Zbl 1106.62323
[19] Gu, C.: Smoothing Spline ANOVA Models. Springer, Berlin (2002) · Zbl 1051.62034
[20] Gyorfi, L., Kohler, M., Krzyzak, A., Walk, H.: A Distribution-Free Theory of Nonparametric Regression. Springer, Berlin (2002)
[21] Hastie, T.J., Tibshirani, R.J.: Generalized Additive Models. Chapman & Hall, New York (1995) · Zbl 0747.62061
[22] Hurvich, C., Simonoff, J.S., Tsai, C.L.: Smoothing parameter selection in nonparametric regression using an improved Akaike information criterion. J. R. Stat. Soc. B 60, 271–294 (1998) · Zbl 0909.62039
[23] Lepski, O.: Asymptotically minimax adaptive estimation. I: Upper bounds. Optimally adaptive estimates. Theory Probab. Appl. 37, 682–697 (1991) · Zbl 0776.62039
[24] Li, K.C.: Asymptotic optimality for C_p, C_L, cross-validation and generalized cross-validation: discrete index set. Ann. Stat. 15, 958–975 (1987) · Zbl 0653.62037
[25] Ridgeway, G.: Additive logistic regression: a statistical view of boosting: discussion. Ann. Stat. 28, 393–400 (2000)
[26] Schwarz, G.: Estimating the dimension of a model. Ann. Stat. 6, 461–464 (1978) · Zbl 0379.62005
[27] Simonoff, J.S.: Smoothing Methods in Statistics. Springer, New York (1996) · Zbl 0859.62035
[28] Tsybakov, A.: Introduction to Nonparametric Estimation. Springer, Berlin (2009) · Zbl 1176.62032
[29] Tukey, J.W.: Exploratory Data Analysis. Addison-Wesley, Reading (1977) · Zbl 0409.62003
[30] Wood, S.N.: Thin plate regression splines. J. R. Stat. Soc. B 65, 95–114 (2003) · Zbl 1063.62059
[31] Wood, S.N.: Stable and efficient multiple smoothing parameter estimation for generalized additive models. J. Am. Stat. Assoc. 99, 673–686 (2004) · Zbl 1117.62445
[32] Yang, Y.: Combining different procedures for adaptive regression. J. Multivar. Anal. 74, 135–161 (2000) · Zbl 0964.62032
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.