Testing predictor contributions in sufficient dimension reduction. (English) Zbl 1092.62046
Summary: We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower-dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
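The summary builds on sliced inverse regression (SIR) as the working dimension-reduction method. As a rough illustration of that underlying idea only (not the paper's predictor-contribution tests), the sketch below estimates sufficient dimension reduction directions with a basic SIR: standardize the predictors, slice the response, and take the leading eigenvectors of the between-slice covariance of the slice means. The function name, slice count, and toy data are illustrative assumptions, not from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Minimal sketch of sliced inverse regression (Li, 1991) for
    estimating sufficient dimension reduction directions.
    Illustrative only; not the authors' test procedure."""
    n, p = X.shape

    # Standardize predictors: Z = (X - mean(X)) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Partition observations into slices of roughly equal size by y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    dirs = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)

# Toy usage (assumed data): y depends on X only through X1 - X2,
# so the estimated direction should be close to (1, -1, 0, 0)/sqrt(2).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
y = (X[:, 0] - X[:, 1]) + 0.2 * rng.standard_normal(500)
print(sir_directions(X, y, n_slices=8, n_dirs=1).ravel())
```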

MSC:
62G08 Nonparametric regression
62H05 Characterization and structure theory (multivariate analysis)
62E20 Asymptotic distribution theory in statistics