zbMATH — the first resource for mathematics

Testing predictor contributions in sufficient dimension reduction. (English) Zbl 1092.62046
Summary: We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower-dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
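The sliced inverse regression (SIR) machinery on which the methodology is built can be sketched in a few lines. The code below is a minimal illustration of the basic SIR estimator of Li (1991), not of the paper's predictor tests themselves; the function name `sir_directions` and all parameter choices (number of slices, number of directions) are illustrative assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Minimal sketch of sliced inverse regression (Li, 1991).

    Estimates a basis for a lower-dimensional subspace of the
    predictors that carries the regression information. The
    hypothesis tests developed in the paper are NOT implemented here.
    """
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice the response into roughly equal-count slices
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M estimate the reduction in Z-scale;
    # back-transform to the scale of the original predictors
    _, vecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    return Sigma_inv_sqrt @ vecs[:, ::-1][:, :n_directions]
```

On simulated data with a single true direction (e.g. a cubic link plus noise), the leading estimated direction aligns closely with the true one, which is the sense in which the predictor vector can be replaced by a lower-dimensional version without loss of regression information.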

62G08 Nonparametric regression and quantile regression
62H05 Characterization and structure theory for multivariate probability distributions; copulas
62E20 Asymptotic distribution theory in statistics
Full Text: DOI arXiv
[1] Bura, E. and Cook, R. D. (2001a). Extending sliced inverse regression: The weighted chi-squared test. J. Amer. Statist. Assoc. 96 996–1003. · Zbl 1047.62035
[2] Bura, E. and Cook, R. D. (2001b). Estimating the structural dimension of regressions via parametric inverse regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 63 393–410. · Zbl 0979.62041
[3] Chen, C.-H. and Li, K. C. (1998). Can SIR be as popular as multiple linear regression? Statist. Sinica 8 289–316. · Zbl 0897.62069
[4] Chiaromonte, F., Cook, R. D. and Li, B. (2002). Sufficient dimension reduction in regressions with categorical predictors. Ann. Statist. 30 475–497. · Zbl 1012.62036
[5] Cook, R. D. (1994). On the interpretation of regression plots. J. Amer. Statist. Assoc. 89 177–189. · Zbl 0791.62066
[6] Cook, R. D. (1996). Graphics for regressions with a binary response. J. Amer. Statist. Assoc. 91 983–992. · Zbl 0882.62060
[7] Cook, R. D. (1998a). Regression Graphics . Wiley, New York. · Zbl 0903.62001
[8] Cook, R. D. (1998b). Principal Hessian directions revisited (with discussion). J. Amer. Statist. Assoc. 93 84–100. · Zbl 0922.62057
[9] Cook, R. D. and Critchley, F. (2000). Identifying regression outliers and mixtures graphically. J. Amer. Statist. Assoc. 95 781–794. · Zbl 0999.62056
[10] Cook, R. D. and Lee, H. (1999). Dimension reduction in binary response regression. J. Amer. Statist. Assoc. 94 1187–1200. · Zbl 1072.62619
[11] Cook, R. D. and Li, B. (2002). Dimension reduction for conditional mean in regression. Ann. Statist. 30 455–474. · Zbl 1012.62035
[12] Cook, R. D. and Nachtsheim, C. J. (1994). Reweighting to achieve elliptically contoured covariates in regression. J. Amer. Statist. Assoc. 89 592–599. · Zbl 0799.62078
[13] Cook, R. D. and Weisberg, S. (1991). Discussion of “Sliced inverse regression for dimension reduction,” by K.-C. Li. J. Amer. Statist. Assoc. 86 328–332. · Zbl 1353.62037
[14] Cook, R. D. and Weisberg, S. (1999a). Graphs in statistical analysis: Is the medium the message? Amer. Statist. 53 29–37.
[15] Cook, R. D. and Weisberg, S. (1999b). Applied Regression Including Computing and Graphics . Wiley, New York. · Zbl 0928.62045
[16] Eaton, M. L. and Tyler, D. E. (1994). The asymptotic distribution of singular values with applications to canonical correlations and correspondence analysis. J. Multivariate Anal. 50 238–264. · Zbl 0805.62020
[17] Field, C. (1993). Tail areas of linear combinations of chi-squares and non-central chi-squares. J. Statist. Comput. Simulation 45 243–248. · Zbl 0925.62062
[18] Gather, U., Hilker, T. and Becker, C. (2001). A robustified version of sliced inverse regression. In Statistics in Genetics and in the Environmental Sciences (L. T. Fernholz, S. Morgenthaler and W. Stahel, eds.) 145–157. Birkhäuser, Basel.
[19] Hall, P. and Li, K. C. (1993). On almost linearity of low dimensional projections from high dimensional data. Ann. Statist. 21 867–889. · Zbl 0782.62065
[20] Hsing, T. and Carroll, R. J. (1992). An asymptotic theory for sliced inverse regression. Ann. Statist. 20 1040–1061. · Zbl 0821.62019
[21] Li, B., Cook, R. D. and Chiaromonte, F. (2003). Dimension reduction for the conditional mean in regressions with categorical predictors. Ann. Statist. 31 1636–1668. · Zbl 1042.62037
[22] Li, K. C. (1991). Sliced inverse regression for dimension reduction (with discussion). J. Amer. Statist. Assoc. 86 316–342. · Zbl 0742.62044
[23] Li, K. C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein’s lemma. J. Amer. Statist. Assoc. 87 1025–1039. · Zbl 0765.62003
[24] Li, K. C. (1997). Nonlinear confounding in high-dimensional regression. Ann. Statist. 25 577–612. · Zbl 0873.62071
[25] Muirhead, R. J. (1982). Aspects of Multivariate Statistical Theory . Wiley, New York. · Zbl 0556.62028
[26] Peters, B. C., Redner, R. and Decell, H. P. (1978). Characterizations of linear sufficient statistics. Sankhyā Ser. A 40 303–309. · Zbl 0422.62005
[27] Rao, C. R. (1965). Linear Statistical Inference and Its Applications . Wiley, New York. · Zbl 0137.36203
[28] Schott, J. (1994). Determining the dimensionality in sliced inverse regression. J. Amer. Statist. Assoc. 89 141–148. · Zbl 0791.62069
[29] Xia, Y., Tong, H., Li, W. K. and Zhu, L.-X. (2002). An adaptive estimation of dimension reduction space. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 363–410. · Zbl 1091.62028
[30] Zhu, L.-X. and Fang, K.-T. (1996). Asymptotics for kernel estimate of sliced inverse regression. Ann. Statist. 24 1053–1068. · Zbl 0864.62027
[31] Zhu, L.-X. and Ng, K. W. (1995). Asymptotics of sliced inverse regression. Statist. Sinica 5 727–736. · Zbl 0824.62036