
Rho-estimators revisited: general theory and applications. (English) Zbl 1407.62169

Summary: Following Y. Baraud et al. [Invent. Math. 207, No. 2, 425–517 (2017; Zbl 1373.62141)], we pursue our attempt to design a robust universal estimator of the joint distribution of \(n\) independent (but not necessarily i.i.d.) observations for a Hellinger-type loss. Given such observations with an unknown joint distribution \(\mathbf{P}\) and a dominated model \(\mathscr{Q}\) for \(\mathbf{P}\), we build an estimator \(\widehat{\mathbf{P}}\) based on \(\mathscr{Q}\) (a \(\rho\)-estimator) and measure its risk by a Hellinger-type distance. When \(\mathbf{P}\) does belong to the model, this risk is bounded by a quantity that depends on the local complexity of the model in a vicinity of \(\mathbf{P}\). In most situations, this bound corresponds to the minimax risk over the model (up to a possible logarithmic factor). When \(\mathbf{P}\) does not belong to the model, the risk involves an additional bias term proportional to the distance between \(\mathbf{P}\) and \(\mathscr{Q}\), whatever the true distribution \(\mathbf{P}\). From this point of view, this new version of \(\rho\)-estimators improves upon the previous one described in [loc. cit.], which required that \(\mathbf{P}\) be absolutely continuous with respect to some known reference measure. Further improvements have also been made compared to the former construction. In particular, the new version provides a very general treatment of the regression framework with random design as well as a computationally tractable procedure for aggregating estimators. We also give conditions under which the maximum likelihood estimator is a \(\rho\)-estimator. Finally, we consider the situation where the statistician has many different models at his or her disposal, and we build a penalized version of the \(\rho\)-estimator for model selection and adaptation purposes. In the regression setting, this penalized estimator allows one to estimate not only the regression function but also the distribution of the errors.
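
For readers less familiar with this framework, the following display sketches the Hellinger-type distance underlying the loss and the generic shape of the risk bounds just described; the constant \(C\) and the local-complexity term \(D_n(\mathscr{Q},\mathbf{P})\) are illustrative placeholders rather than the exact quantities of the paper.
\[
h^{2}(P,Q) \;=\; \frac{1}{2}\int \Bigl(\sqrt{\tfrac{dP}{d\mu}}-\sqrt{\tfrac{dQ}{d\mu}}\Bigr)^{2}\,d\mu,
\qquad
\mathbf{h}^{2}(\mathbf{P},\mathbf{Q}) \;=\; \sum_{i=1}^{n} h^{2}(P_i,Q_i),
\]
for \(\mathbf{P}=\bigotimes_{i=1}^{n}P_i\), \(\mathbf{Q}=\bigotimes_{i=1}^{n}Q_i\) and any common dominating measure \(\mu\), together with risk bounds of the generic form
\[
\mathbb{E}\bigl[\mathbf{h}^{2}(\mathbf{P},\widehat{\mathbf{P}})\bigr]
\;\le\; C\Bigl[\inf_{\mathbf{Q}\in\mathscr{Q}}\mathbf{h}^{2}(\mathbf{P},\mathbf{Q})+D_n(\mathscr{Q},\mathbf{P})\Bigr],
\]
where the infimum accounts for the bias term and \(D_n(\mathscr{Q},\mathbf{P})\) for the local complexity of the model.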

MSC:

62G35 Nonparametric robustness
62G05 Nonparametric estimation
62G07 Density estimation
62G08 Nonparametric regression and quantile regression
62C20 Minimax procedures in statistical decision theory

Citations:

Zbl 1373.62141

References:

[1] Audibert, J.-Y. and Catoni, O. (2011). Robust linear least squares regression. Ann. Statist. 39 2766–2794. · Zbl 1231.62126
[2] Baraud, Y. (2016). Bounding the expectation of the supremum of an empirical process over a (weak) VC-major class. Electron. J. Stat. 10 1709–1728. · Zbl 1385.60038
[3] Baraud, Y. and Birgé, L. (2016). Rho-estimators for shape restricted density estimation. Stochastic Process. Appl. 126 3888–3912. · Zbl 1419.62070
[4] Baraud, Y. and Birgé, L. (2018). Supplement to “Rho-estimators revisited: General theory and applications.” DOI:10.1214/17-AOS1675SUPP.
[5] Baraud, Y., Birgé, L. and Sart, M. (2017). A new method for estimation and model selection: \(ρ\)-estimation. Invent. Math. 207 425–517. · Zbl 1373.62141
[6] Birgé, L. (1983). Approximation dans les espaces métriques et théorie de l’estimation. Z. Wahrsch. Verw. Gebiete 65 181–237. · Zbl 0506.62026
[7] Birgé, L. (2006). Model selection via testing: An alternative to (penalized) maximum likelihood estimators. Ann. Inst. Henri Poincaré Probab. Stat. 42 273–325.
[8] Birgé, L. and Massart, P. (1998). Minimum contrast estimators on sieves: Exponential bounds and rates of convergence. Bernoulli 4 329–375. · Zbl 0954.62033
[9] Giné, E. and Koltchinskii, V. (2006). Concentration inequalities and asymptotic results for ratio type empirical processes. Ann. Probab. 34 1143–1216. · Zbl 1152.60021
[10] Györfi, L., Kohler, M., Krzyżak, A. and Walk, H. (2002). A Distribution-Free Theory of Nonparametric Regression. Springer, New York.
[11] Koltchinskii, V. (2006). Local Rademacher complexities and oracle inequalities in risk minimization. Ann. Statist. 34 2593–2656. · Zbl 1118.62065
[12] Le Cam, L. (1973). Convergence of estimates under dimensionality restrictions. Ann. Statist. 1 38–53. · Zbl 0255.62006
[13] Le Cam, L. (1975). On local and global properties in the theory of asymptotic normality of experiments. In Stochastic Processes and Related Topics (Proc. Summer Res. Inst. Statist. Inference for Stochastic Processes, Indiana Univ., Bloomington, Ind., 1974, Vol. 1; Dedicated to Jerzy Neyman) 13–54. Academic Press, New York.
[14] Le Cam, L. (1990). Maximum likelihood: An introduction. Int. Stat. Rev. 58 153–171. · Zbl 0715.62045
[15] Pollard, D. (1984). Convergence of Stochastic Processes. Springer, New York. · Zbl 0544.60045
[16] Sart, M. (2017). Estimating the conditional density by histogram type estimators and model selection. ESAIM, Probab. Stat. 21 34–55. · Zbl 1453.62447
[17] van de Geer, S. A. (2000). Applications of Empirical Process Theory. Cambridge Series in Statistical and Probabilistic Mathematics 6. Cambridge Univ. Press, Cambridge. · Zbl 0953.62049
[18] van der Vaart, A. W. and Wellner, J. A. (1996). Weak Convergence and Empirical Processes. With Applications to Statistics. Springer, New York. · Zbl 0862.60002