Robust parameter estimation with a small bias against heavy contamination. (English) Zbl 1169.62010
Summary: We consider robust parameter estimation based on a certain cross entropy and its associated divergence. The robust estimate is defined as the minimizer of the empirically estimated cross entropy. We show that the robust estimate can be regarded as a kind of projection, via a Pythagorean relation based on the divergence, and that this property makes the bias caused by outliers sufficiently small even under heavy contamination. The asymptotic variance of the robust estimator is naturally inflated in proportion to the contamination ratio. One might surmise that some other form of cross entropy exhibits the same behavior; we prove that, under certain conditions, no cross entropy does so except the one considered here and its monotone transformations.
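A minimal numerical sketch of this estimation scheme follows. It is an illustration under stated assumptions, not the authors' implementation: it assumes a power-type (γ-)cross entropy with the arbitrary illustrative choice γ = 0.5 and a univariate Gaussian model N(μ, σ²), for which the normalizing integral ∫ f(x; θ)^{1+γ} dx has the closed form (2πσ²)^{−γ/2}(1 + γ)^{−1/2}; the function names and the contamination setup are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

def gamma_cross_entropy(theta, x, gamma=0.5):
    """Empirical power-type cross entropy for a Gaussian N(mu, sigma^2).

    First term:  -(1/gamma) * log( (1/n) * sum_i f(x_i; theta)^gamma )
    Second term: (1/(1+gamma)) * log( integral of f^(1+gamma) ), which for a
    Gaussian equals (2*pi*sigma^2)^(-gamma/2) * (1+gamma)^(-1/2) in closed form.
    """
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # unconstrained parametrization of the scale
    dens = (np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
            / np.sqrt(2.0 * np.pi * sigma ** 2))
    term1 = -np.log(np.mean(dens ** gamma)) / gamma
    term2 = (-0.5 * gamma * np.log(2.0 * np.pi * sigma ** 2)
             - 0.5 * np.log(1.0 + gamma)) / (1.0 + gamma)
    return term1 + term2

rng = np.random.default_rng(0)
# 30% contamination: target N(0, 1), outliers drawn from N(10, 1)
x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(10.0, 1.0, 300)])

fit = minimize(gamma_cross_entropy, x0=np.array([np.median(x), 0.0]), args=(x,))
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"robust estimate:   mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
print(f"non-robust sample mean = {x.mean():.3f}")  # dragged toward the outliers
```

On a contaminated sample of this kind the plain sample mean is pulled toward the outliers, whereas the minimizer of the empirical cross entropy stays close to the target parameters, which is the small-bias behavior described above.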

MSC:
62F10 Point estimation
62F35 Robustness and adaptive procedures (parametric inference)
62B10 Statistical aspects of information-theoretic topics
62F12 Asymptotic properties of parametric estimators
Software:
robustbase
References:
[1] Amari, S., Differential-geometrical methods in statistics, (1985), Springer Berlin · Zbl 0559.62001
[2] Basu, A.; Harris, I.R.; Hjort, N.L.; Jones, M.C., Robust and efficient estimation by minimising a density power divergence, Biometrika, 85, 549-559, (1998) · Zbl 0926.62021
[3] Croux, C.; Rousseeuw, P.J., Time-efficient algorithms for two highly robust estimators of scale, Comput. Statist., 2, 411-428, (1992)
[4] Hampel, F.R.; Ronchetti, E.M.; Rousseeuw, P.J.; Stahel, W.A., Robust statistics: the approach based on influence functions, (1986), Wiley New York · Zbl 0593.62027
[5] Huber, P.J., Robust statistics, (1981), Wiley New York · Zbl 0536.62025
[6] Jones, M.C.; Hjort, N.L.; Harris, I.R.; Basu, A., A comparison of related density-based minimum divergence estimators, Biometrika, 88, 865-873, (2001) · Zbl 1180.62047
[7] Komaki, F., On asymptotic properties of predictive distributions, Biometrika, 83, 299-313, (1996) · Zbl 0864.62007
[8] Lebanon, G.; Lafferty, J., Boosting and maximum likelihood for exponential models, Adv. NIPS, 14, 447-454, (2002)
[9] Lehmann, E.L., Elements of large-sample theory, (1999), Springer New York
[10] Lindsay, B.G., Efficiency versus robustness: the case for minimum Hellinger distance and related methods, Ann. Statist., 22, 1081-1114, (1994) · Zbl 0807.62030
[11] Maronna, R.A.; Martin, R.D.; Yohai, V.J., Robust statistics: theory and methods, (2006), Wiley New York · Zbl 1094.62040
[12] Miyamura, M.; Kano, Y., Robust Gaussian graphical modeling, J. Multivariate Anal., 97, 1525-1550, (2006) · Zbl 1093.62038
[13] Murata, N.; Takenouchi, T.; Kanamori, T.; Eguchi, S., Information geometry of U-boost and Bregman divergence, Neural Comput., 16, 1437-1481, (2004) · Zbl 1102.68489
[14] Rousseeuw, P.J.; Croux, C., Alternatives to the median absolute deviation, J. Amer. Statist. Assoc., 88, 1273-1283, (1993) · Zbl 0792.62025
[15] Scott, D.W., Parametric statistical modeling by minimum integrated square error, Technometrics, 43, 274-285, (2001)
[16] Windham, M.P., Robustifying model fitting, J. Roy. Statist. Soc. B, 57, 599-609, (1995) · Zbl 0827.62030