Improving estimated sufficient summary plots in dimension reduction using minimization criteria based on initial estimates. (English) Zbl 1347.65029

Summary: In this paper we show that estimated sufficient summary plots can be greatly improved when the dimension reduction estimates are adjusted by minimizing an objective function. The dimension reduction methods primarily considered are ordinary least squares, sliced inverse regression, sliced average variance estimates and principal Hessian directions. Some consideration is also given to minimum average variance estimation. Simulations support the usefulness of the approach, and three data sets are analyzed with an emphasis on two- and three-dimensional estimated sufficient summary plots.
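To illustrate the kind of estimate the paper starts from, the following is a minimal sketch of sliced inverse regression (SIR), one of the dimension reduction methods named above. The function name, slice count default, and implementation details are ours for illustration, not taken from the paper; the paper's contribution is the subsequent adjustment of such estimates via an objective function, which is not reproduced here.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Illustrative sliced inverse regression (Li, 1991):
    estimate directions spanning the central subspace."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the observations by the order of the response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance matrix of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)  # unit-length columns
```

Plotting the response against `X @ B` then gives the estimated sufficient summary plot that the paper's minimization criteria aim to improve.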

MSC:

62-08 Computational methods for problems pertaining to statistics
62H25 Factor analysis and principal components; correspondence analysis
62J99 Linear inference, regression

References:

[1] Adragni KP, Raim A (2014) ldr: an R software package for likelihood-based sufficient dimension reduction. J Stat Softw 61:1-21
[2] Castlehouse H (2008) The biogeochemical controls on arsenic mobilisation in a geogenic arsenic rich soil. PhD Dissertation, University of Sheffield
[3] Chang J, Olive DJ (2007) Resistant dimension reduction. Culver Stockton College and Southern Illinois University, unpublished manuscript. www.math.siu.edu/olive/ppresdr.pdf
[4] Cook RD (1996) Graphics for regressions with a binary response. J Am Stat Assoc 91:983-992 · Zbl 0882.62060
[5] Cook RD (1998a) Principal Hessian directions revisited. J Am Stat Assoc 93:84-100 with comments by Ker-Chau Li and a rejoinder by the author · Zbl 0922.62057
[6] Cook RD (1998b) Regression graphics: ideas for studying regressions through graphics. Wiley, New York · Zbl 0903.62001
[7] Cook RD (2007) Fisher lecture: dimension reduction in regression. Stat Sci 22:1-26 · Zbl 1246.62148
[8] Cook RD, Forzani L (2008) Principal fitted components for dimension reduction in regression. Stat Sci 23:485-501 · Zbl 1329.62274
[9] Cook RD, Weisberg S (1991) Comment on “Sliced inverse regression for dimension reduction” by K.-C. Li. J Am Stat Assoc 86:328-332 · Zbl 1353.62037
[10] Enz R (1991) Prices and earnings around the globe. Union Bank of Switzerland, Zurich
[11] Garnham AL, Prendergast LA (2013) A note on least squares sensitivity in single-index model estimation and the benefits of response transformations. Electron J Stat 7:1983-2004 · Zbl 1293.62141
[12] Gather U, Hilker T, Becker C (2001) A robustified version of sliced inverse regression. In: Statistics in genetics and in the environmental sciences (Ascona, 1999). Trends Math., Birkhäuser, Basel, pp 147-157
[13] Gather U, Hilker T, Becker C (2002) A note on outlier sensitivity of sliced inverse regression. Statistics 36:271-281 · Zbl 1020.62023
[14] Huber PJ (1964) Robust estimation of a location parameter. Ann Math Stat 35:73-101 · Zbl 0136.39805
[15] Huber PJ (1973) Robust regression: asymptotics, conjectures and Monte Carlo. Ann Stat 1:799-821 · Zbl 0289.62033
[16] Li KC (1991) Sliced inverse regression for dimension reduction. J Am Stat Assoc 86:316-342 with discussion and a rejoinder by the author · Zbl 0742.62044
[17] Li KC (1992) On principal Hessian directions for data visualization and dimension reduction: another application of Stein’s lemma. J Am Stat Assoc 87:1025-1039 · Zbl 0765.62003
[18] Li KC, Duan N (1989) Regression analysis under link violation. Ann Stat 17:1009-1052 · Zbl 0753.62041
[19] Li L, Li B, Zhu LX (2010) Groupwise dimension reduction. J Am Stat Assoc 105:1188-1201 · Zbl 1390.62064
[20] Lin L (1989) A concordance correlation coefficient to evaluate reproducibility. Biometrics 45:255-268 · Zbl 0715.62114
[21] Liquet B, Saracco J (2012) A graphical tool for selecting the number of slices and the dimension of the model in SIR and SAVE approaches. Comput Stat 27:103-125 · Zbl 1304.65054
[22] Lue HH (2001) A study of sensitivity analysis on the method of principal Hessian directions. Comput Stat 16:109-130 · Zbl 1007.62055
[23] Prendergast LA (2007) Implications of influence function analysis for sliced inverse regression and sliced average variance estimation. Biometrika 94:585-601 · Zbl 1135.62047
[24] Prendergast LA (2008) Trimming influential observations for improved single-index model estimated sufficient summary plots. Comput Stat Data Anal 52:5319-5327 · Zbl 1452.62113
[25] Prendergast LA, Smith JA (2010) Influence functions for dimension reduction methods: an example influence study of principal Hessian direction analysis. Scand J Stat 37(4):588-611 · Zbl 1226.62064
[26] R Core Team (2014) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org
[27] Rousseeuw P, Croux C, Todorov V, Ruckstuhl A, Salibian-Barrera M, Verbeke T, Koller M, Maechler M (2011) robustbase: basic robust statistics. R package version 0.8-0. http://CRAN.R-project.org/package=robustbase
[28] Shaker AJ, Prendergast LA (2011) Iterative application of dimension reduction methods. Electron J Stat 5:1471-1494 · Zbl 1271.62143
[29] Sheather SJ (2009) A modern approach to regression with R. Springer, New York · Zbl 1181.62101
[30] Tryfos P (1998) Methods for business analysis and forecasting: text & cases. Wiley, New York
[31] Venables WN, Ripley BD (2002) Modern applied statistics with S, 4th ed. Springer, New York. http://www.stats.ox.ac.uk/pub/MASS4 · Zbl 1006.62003
[32] Weisberg S (2002) Dimension reduction regression in R. J Stat Softw 7:1-22
[33] Xia Y, Tong H, Li WK, Zhu LX (2002) An adaptive estimation of dimension reduction space. J R Stat Soc Ser B Stat Methodol 64:363-410 · Zbl 1091.62028
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.