
Existence, consistency and computer simulation for selected variants of minimum distance estimators. (English) Zbl 1449.62004

Summary: The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function \(f_0\) on the real line. It shows that the AMDE always exists when a bounded \(\phi\)-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, an \(n^{-1/2}\) consistency rate in any bounded \(\phi\)-divergence is established for the Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and of several histogram-based variants of approximate minimum divergence estimators, such as the power-type and Le Cam divergences, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment, maximum likelihood, median) is provided for sample sizes \(n=10,20,50,120,250\). The simulation analyzes the behaviour of the estimators across the different families of distributions. It is shown that the performance of the AMKE differs from that of the other estimators with respect to the family type, and that the AMKE copes more easily with the Cauchy distribution than the standard or divergence-based estimators, especially for small sample sizes.
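
For orientation, recall the usual formalization: with empirical distribution function \(F_n\) and model family \(\{F_\theta\}\), a minimum Kolmogorov distance estimate (approximately) minimizes \(\sup_x|F_n(x)-F_\theta(x)|\) over \(\theta\). The sketch below is only an illustration in that spirit, not the authors' implementation; the sample size, the optimizer, and the location-only Cauchy model are assumptions made for this example.

    # Minimal illustrative sketch (not the authors' code): minimum Kolmogorov
    # distance estimation of a Cauchy location parameter, compared with the
    # sample median. Sample size, optimizer and the location-only model are
    # assumptions made for this example.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import cauchy

    rng = np.random.default_rng(0)
    x = np.sort(cauchy.rvs(loc=2.0, size=50, random_state=rng))
    n = x.size
    ecdf_lower = np.arange(0, n) / n      # empirical CDF just before each order statistic
    ecdf_upper = np.arange(1, n + 1) / n  # empirical CDF at each order statistic

    def kolmogorov_distance(theta):
        # sup_x |F_n(x) - F_theta(x)|; for a continuous model CDF the supremum
        # is attained at the jump points of the empirical CDF.
        F = cauchy.cdf(x, loc=theta)
        return max(np.max(np.abs(F - ecdf_lower)), np.max(np.abs(F - ecdf_upper)))

    # A bounded scalar search; an approximate minimizer suffices, since the AMKE
    # only needs to come within a vanishing tolerance of the infimum.
    res = minimize_scalar(kolmogorov_distance, bounds=(x.min(), x.max()), method="bounded")
    print("minimum Kolmogorov estimate of location:", round(res.x, 3))
    print("sample median:", round(float(np.median(x)), 3))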

MSC:

62B05 Sufficient statistics and fields
62H30 Classification and discrimination; cluster analysis (statistical aspects)
