Dimension reduction for nonelliptically distributed predictors. (English) Zbl 1160.62050
Summary: Sufficient dimension reduction methods often require stringent conditions on the joint distribution of the predictor or, when such conditions are not satisfied, rely on marginal transformations or reweighting to fulfill them approximately. For example, a typical dimension reduction method would require the predictor to have an elliptical or even multivariate normal distribution. We reformulate the commonly used dimension reduction methods via the notion of a “central solution space,” which circumvents such strong assumptions while preserving the desirable properties of the classical methods, such as $\sqrt n$-consistency and asymptotic normality. Imposing elliptical distributions, or even stronger assumptions, on the predictors is often considered a necessary tradeoff for overcoming the “curse of dimensionality,” but the developments in this paper show that this need not be the case. The new methods are compared with existing methods by simulation and applied to a data set.
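For orientation, the following is a minimal sketch (not taken from the paper) of classical sliced inverse regression, one of the standard sufficient dimension reduction methods whose reliance on elliptically distributed predictors the summary alludes to. The function name, slicing scheme, and default parameters are illustrative assumptions; the paper's central-solution-space estimators are not implemented here.

# Illustrative sketch: classical sliced inverse regression (SIR), which assumes
# a linearity condition on the predictor distribution (e.g. elliptical X).
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Estimate candidate central-subspace directions by SIR (hypothetical helper)."""
    n, p = X.shape

    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice the response and average Z within each slice to estimate Cov(E[Z | y])
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M span the estimated standardized subspace;
    # map them back to the original predictor scale.
    _, vecs = np.linalg.eigh(M)
    beta = Sigma_inv_sqrt @ vecs[:, -n_dirs:][:, ::-1]
    return beta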

MSC:
62H12 Multivariate estimation
62G08 Nonparametric regression
62G20 Nonparametric asymptotic efficiency
62G09 Nonparametric statistical resampling methods
62E20 Asymptotic distribution theory in statistics
65C60 Computational problems in statistics
Full Text: DOI arXiv