Lee, Kuang-Yao; Li, Lexin
Functional sufficient dimension reduction through average Fréchet derivatives. (English) Zbl 1486.62115
Ann. Stat. 50, No. 2, 904-929 (2022).

Summary: Sufficient dimension reduction (SDR) embodies a family of methods that aim to reduce dimensionality without loss of information in a regression setting. In this article, we propose a new method for nonparametric function-on-function SDR, where both the response and the predictor are functions. We first develop the notions of the functional central mean subspace and the functional central subspace, which form the population targets of our functional SDR. We then introduce an average Fréchet derivative estimator, which extends the gradient of the regression function to the operator level and enables us to develop estimators for our functional dimension reduction spaces. We show that the resulting functional SDR estimators are unbiased and exhaustive and, more importantly, that they impose no distributional assumptions such as the linearity or constant variance conditions commonly required by all existing functional SDR methods. We establish the uniform convergence of the estimators for the functional dimension reduction spaces, while allowing both the number of Karhunen-Loève expansions and the intrinsic dimension to diverge with the sample size. We demonstrate the efficacy of the proposed methods through both simulations and two real data examples.

Cited in 3 Documents

MSC:
62G08 Nonparametric regression and quantile regression
62G20 Asymptotic properties of nonparametric inference
62H12 Estimation in multivariate analysis
62R10 Functional data analysis

Keywords: consistency; exhaustiveness; functional central mean subspace; functional central subspace; function-on-function regression; reproducing kernel Hilbert space; unbiasedness

Software: fda (R)

Cite: K.-Y. Lee and L. Li, Ann. Stat. 50, No. 2, 904-929 (2022; Zbl 1486.62115)
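The Karhunen-Loève expansions mentioned in the summary can be illustrated with a small, self-contained sketch. This is not the authors' code: the two-component simulation, the grid size, and all variable names are illustrative assumptions. The idea it shows is standard FPCA: discretized curves are projected onto the leading eigenvectors of their sample covariance, yielding the finite-dimensional scores on which a truncated functional method operates.

```python
import numpy as np

# Illustrative Karhunen-Loeve (FPCA) truncation: project discretized
# curves onto the leading eigenvectors of their sample covariance.
rng = np.random.default_rng(1)
n, T = 200, 50
t = np.linspace(0.0, 1.0, T)

# Two-component model: X(t) = xi1*sqrt(2)*sin(2*pi*t) + xi2*sqrt(2)*cos(2*pi*t),
# with Var(xi1) = 4 and Var(xi2) = 1, so the first component dominates.
xi = rng.standard_normal((n, 2)) * np.array([2.0, 1.0])
phi = np.vstack([np.sqrt(2.0) * np.sin(2 * np.pi * t),
                 np.sqrt(2.0) * np.cos(2 * np.pi * t)])
curves = xi @ phi                      # n x T matrix of observed curves

centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / n        # T x T sample covariance (uniform grid)
evals, evecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
d = 2                                  # truncation level
psi = evecs[:, ::-1][:, :d]            # leading d discretized eigenfunctions
scores = centered @ psi                # n x d Karhunen-Loeve scores
```

In the function-on-function setting both the predictor and the response would be expanded this way, and the paper's asymptotics allow the truncation level d to diverge with the sample size n.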
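The average derivative idea itself, averaging outer products of estimated regression gradients and reading the dimension reduction directions off the leading eigenvectors, can be sketched in the classical multivariate case. This is a minimal Härdle-Stoker-style outer-product-of-gradients illustration under assumed toy data, not the paper's operator-level Fréchet derivative estimator; the bandwidth, ridge constant, and model are arbitrary choices for the demo.

```python
import numpy as np

def local_linear_gradient(X, y, x0, h):
    """Estimate the regression gradient at x0 by a weighted local linear fit."""
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2.0 * h ** 2))  # Gaussian weights
    Z = np.hstack([np.ones((X.shape[0], 1)), X - x0])
    A = Z.T @ (w[:, None] * Z) + 1e-6 * np.eye(Z.shape[1])       # small ridge
    beta = np.linalg.solve(A, Z.T @ (w * y))
    return beta[1:]                    # slope part = gradient estimate at x0

def gradient_outer_product(X, y, h):
    """Average the outer products of estimated gradients over the sample."""
    n, p = X.shape
    M = np.zeros((p, p))
    for i in range(n):
        g = local_linear_gradient(X, y, X[i], h)
        M += np.outer(g, g)
    return M / n

rng = np.random.default_rng(0)
n, p = 400, 4
X = rng.standard_normal((n, p))
b = np.zeros(p); b[0] = 1.0            # true single-index direction
y = np.sin(X @ b) + 0.1 * rng.standard_normal(n)

M = gradient_outer_product(X, y, h=1.0)
evals, evecs = np.linalg.eigh(M)
b_hat = evecs[:, -1]                   # leading eigenvector estimates the
                                       # central mean subspace direction
```

The paper replaces these finite-dimensional gradients with Fréchet derivatives of operators between function spaces; the final eigendecomposition step plays the same role there.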