
Divide and conquer in nonstandard problems and the super-efficiency phenomenon. (English) Zbl 1416.62259

The paper is a theoretical study of the divide-and-conquer approach for nonstandard statistical problems, with a focus on the nonparametric estimation of a monotone function. Such problems, of which isotonic regression is the prototypical example, exhibit nonstandard limit distributions and slow (cube-root) rates of convergence, so that splitting very large data sets into subsamples that are processed separately is attractive but delicate. In particular, the paper studies the super-efficiency phenomenon that can arise when the estimates obtained from the subsamples are combined.
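
To make the setting concrete, here is a minimal Python sketch (not taken from the paper) of one natural divide-and-conquer scheme in the isotonic regression setting: the sample is split into m disjoint subsamples, an isotonic regression is fitted on each by the pool-adjacent-violators algorithm, and the fitted values at a fixed point x0 are averaged. The helper names (pava, isotonic_at, dc_isotonic_at), the random splitting and the toy data are illustrative assumptions, not the authors' code.

    import numpy as np

    def pava(y):
        # Pool-adjacent-violators: least-squares non-decreasing fit to y (equal weights).
        sums, cnts, means = [], [], []
        for v in np.asarray(y, dtype=float):
            sums.append(v); cnts.append(1); means.append(v)
            # Merge adjacent blocks while they violate monotonicity.
            while len(means) > 1 and means[-2] > means[-1]:
                s, c = sums.pop(), cnts.pop()
                means.pop()
                sums[-1] += s
                cnts[-1] += c
                means[-1] = sums[-1] / cnts[-1]
        return np.concatenate([np.full(c, m) for c, m in zip(cnts, means)])

    def isotonic_at(x, y, x0):
        # Isotonic regression of y on x, evaluated at x0 via the fitted step function.
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        order = np.argsort(x)
        xs, fit = x[order], pava(y[order])
        idx = np.searchsorted(xs, x0, side="right") - 1
        return fit[max(idx, 0)]

    def dc_isotonic_at(x, y, x0, m, seed=0):
        # Divide-and-conquer estimate at x0: split the sample into m disjoint
        # subsamples, fit isotonic regression on each, and average the m values at x0.
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        rng = np.random.default_rng(seed)
        parts = np.array_split(rng.permutation(len(x)), m)
        return np.mean([isotonic_at(x[p], y[p], x0) for p in parts])

    # Toy usage: noisy observations of the increasing function f(x) = x**2.
    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, 10_000)
    y = x ** 2 + rng.normal(scale=0.1, size=x.size)
    print(isotonic_at(x, y, 0.5))           # full-sample estimate of f(0.5) = 0.25
    print(dc_isotonic_at(x, y, 0.5, m=10))  # pooled estimate from 10 subsamples

The sketch is only meant to make the construction concrete; the paper's contribution is the asymptotic analysis of how such pooling affects pointwise and worst-case behaviour in this cube-root regime, which is where the super-efficiency phenomenon appears.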

MSC:

62G20 Asymptotic properties of nonparametric inference
62G08 Nonparametric regression and quantile regression
62F30 Parametric inference under constraints

References:

[1] Banerjee, M. (2005). Likelihood ratio tests under local and fixed alternatives in monotone function problems. Scand. J. Stat.32 507-525. · Zbl 1091.62030 · doi:10.1111/j.1467-9469.2005.00458.x
[2] Banerjee, M. (2007). Likelihood based inference for monotone response models. Ann. Statist.35 931-956. · Zbl 1133.62328 · doi:10.1214/009053606000001578
[3] Banerjee, M. (2008). Estimating monotone, unimodal and U-shaped failure rates using asymptotic pivots. Statist. Sinica 467-492. · Zbl 1135.62079
[4] Banerjee, M., Durot, C. and Sen, B. (2019). Supplement to “Divide and conquer in nonstandard problems and the super-efficiency phenomenon.” DOI:10.1214/17-AOS1633SUPP. · Zbl 1416.62259
[5] Banerjee, M. and McKeague, I. W. (2007). Confidence sets for split points in decision trees. Ann. Statist.35 543-574. · Zbl 1117.62037 · doi:10.1214/009053606000001415
[6] Banerjee, M. and Wellner, J. A. (2005). Confidence intervals for current status data. Scand. J. Stat.32 405-424. · Zbl 1087.62107 · doi:10.1111/j.1467-9469.2005.00454.x
[7] Brown, L. D., Low, M. G. and Zhao, L. H. (1997). Superefficiency in nonparametric function estimation. Ann. Statist.25 2607-2625. · Zbl 0895.62043 · doi:10.1214/aos/1030741087
[8] Brunk, H. D. (1955). Maximum likelihood estimates of monotone parameters. Ann. Math. Stat.26 607-616. · Zbl 0066.38503 · doi:10.1214/aoms/1177728420
[9] Brunk, H. D. (1970). Estimation of isotonic regression. In Nonparametric Techniques in Statistical Inference 177-197. Cambridge Univ. Press, London.
[10] Chernoff, H. (1964). Estimation of the mode. Ann. Inst. Statist. Math.16 31-41. · Zbl 0212.21802 · doi:10.1007/BF02868560
[11] Durot, C. (2002). Sharp asymptotics for isotonic regression. Probab. Theory Related Fields122 222-240. · Zbl 0992.60028 · doi:10.1007/s004400100171
[12] Durot, C. (2008). Monotone nonparametric regression with random design. Math. Methods Statist.17 327-341. · Zbl 1231.62066 · doi:10.3103/S1066530708040042
[13] Durot, C. and Lopuhaä, H. P. (2014). A Kiefer-Wolfowitz type of result in a general setting, with an application to smooth monotone estimation. Electron. J. Stat.8 2479-2513. · Zbl 1309.62065 · doi:10.1214/14-EJS958
[14] Durot, C. and Thiébot, K. (2006). Bootstrapping the shorth for regression. ESAIM Probab. Stat.10 216-235. · Zbl 1187.62034 · doi:10.1051/ps:2006007
[15] Grenander, U. (1956). On the theory of mortality measurement, part II. Skand. Aktuarietidskr.39 125-153. · Zbl 0077.33715
[16] Groeneboom, P. and Jongbloed, G. (2014). Nonparametric Estimation Under Shape Constraints: Estimators, Algorithms and Asymptotics. Cambridge Series in Statistical and Probabilistic Mathematics38. Cambridge Univ. Press, Cambridge. · Zbl 1338.62008
[17] Groeneboom, P. and Wellner, J. A. (1992). Information Bounds and Nonparametric Maximum Likelihood Estimation. DMV Seminar19. Birkhäuser, Basel. · Zbl 0757.62017
[18] Huang, J. and Wellner, J. A. (1995). Estimation of a monotone density or monotone hazard under random censoring. Scand. J. Stat. 3-33. · Zbl 0827.62032
[19] Kim, J. and Pollard, D. (1990). Cube root asymptotics. Ann. Statist.18 191-219. · Zbl 0703.62063 · doi:10.1214/aos/1176347498
[20] Li, R., Lin, D. K. J. and Li, B. (2013). Statistical inference in massive data sets. Appl. Stoch. Models Bus. Ind.29 399-409.
[21] Manski, C. F. (1975). Maximum score estimation of the stochastic utility model of choice. J. Econometrics3 205-228. · Zbl 0307.62068 · doi:10.1016/0304-4076(75)90032-9
[22] Massart, P. (1990). The tight constant in the Dvoretzky-Kiefer-Wolfowitz inequality. Ann. Probab.18 1269-1283. · Zbl 0713.62021 · doi:10.1214/aop/1176990746
[23] Prakasa Rao, B. L. S. (1969). Estimation of a unimodal density. Sankhyā, Ser. A31 23-36. · Zbl 0181.45901
[24] Revuz, D. and Yor, M. (2013). Continuous Martingales and Brownian Motion. Grundlehren der mathematischen Wissenschaften293. Springer, Berlin.
[25] Robertson, T., Wright, F. T. and Dykstra, R. L. (1988). Order Restricted Statistical Inference. Wiley Series in Probability and Mathematical Statistics: Probability and Mathematical Statistics. Wiley, Chichester. · Zbl 0645.62028
[26] Rousseeuw, P. J. (1984). Least median of squares regression. J. Amer. Statist. Assoc.79 871-880. · Zbl 0547.62046 · doi:10.1080/01621459.1984.10477105
[27] Shi, C., Lu, W. and Song, R. (2018). A massive data framework for M-estimators with cubic-rate. J. Amer. Statist. Assoc. To appear. · Zbl 1409.62105
[28] Tsybakov, A. B. (2009). Introduction to Nonparametric Estimation. Springer Series in Statistics. Springer, New York. Revised and extended from the 2004 French original; translated by Vladimir Zaiats.
[29] Wright, F. T. (1981). The asymptotic behavior of monotone regression estimates. Ann. Statist.9 443-448. · Zbl 0471.62062 · doi:10.1214/aos/1176345411
[30] Zhang, Y., Duchi, J. and Wainwright, M. (2013). Divide and conquer kernel ridge regression. In Conference on Learning Theory 592-617.
[31] Zhao, T., Cheng, G. and Liu, H. (2016). A partially linear framework for massive heterogeneous data. Ann. Statist.44 1400-1437. · Zbl 1358.62050 · doi:10.1214/15-AOS1410