zbMATH — the first resource for mathematics

Minimum disparity estimators for discrete and continuous models. (English) Zbl 1059.62001
The paper develops the concept of minimum disparity estimators. The starting point is the quantity \(D_{\varphi }(p;q)=\sum _{i=1}^nq_i\varphi(p_i/q_i)\), called the \(\varphi \)-disparity of \(p\) and \(q\), where \(\varphi \) is unimodal with minimum value zero at \(1\), twice differentiable, and convex in a neighbourhood of \(1\). The authors prove strong consistency of minimum \(\varphi \)-disparity estimators. The theory is illustrated by examples, and applications to both discrete and continuous models are discussed.
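The defining formula \(D_{\varphi }(p;q)=\sum _i q_i\varphi(p_i/q_i)\) is straightforward to evaluate numerically. The sketch below (Python; all names are illustrative, not the authors' notation) computes the disparity for the squared-Hellinger choice \(\varphi(u)=(\sqrt{u}-1)^2\), which satisfies the stated conditions, and finds a minimum-disparity estimate of a Bernoulli parameter by simple grid search — a toy illustration of the general idea, not the paper's estimation procedure.

```python
import numpy as np

def phi_hellinger(u):
    # phi(u) = (sqrt(u) - 1)^2: minimum value 0 at u = 1,
    # twice differentiable, convex — a valid disparity generator.
    return (np.sqrt(u) - 1.0) ** 2

def disparity(p, q, phi=phi_hellinger):
    # D_phi(p; q) = sum_i q_i * phi(p_i / q_i)
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * phi(p / q)))

def min_disparity_bernoulli(p_hat, grid=np.linspace(0.001, 0.999, 999)):
    # Minimum-disparity estimate of theta for the Bernoulli model
    # q_theta = (1 - theta, theta) on {0, 1}, by grid search over theta.
    return min(grid, key=lambda t: disparity(p_hat, [1.0 - t, t]))

p_hat = [0.3, 0.7]                       # empirical relative frequencies
theta = min_disparity_bernoulli(p_hat)   # close to 0.7, since D_phi(p; q) = 0 iff p = q
```

Because \(\varphi(1)=0\) is the unique minimum, the disparity vanishes exactly when the model matches the empirical distribution, so the estimate lands on the grid point nearest \(0.7\). The Pearson choice \(\varphi(u)=(u-1)^2\) could be substituted for `phi_hellinger` to recover a \(\chi^2\)-type criterion.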

62B10 Statistical aspects of information-theoretic topics
62E20 Asymptotic distribution theory in statistics
Full Text: DOI EuDML
[1] A. Basu, S. Sarkar: Minimum disparity estimation in the errors-in-variables model. Statist. Probab. Lett. 20 (1994), 69-73. · Zbl 0925.62149 · doi:10.1016/0167-7152(94)90236-4
[2] A. Basu, S. Sarkar: The trade-off between robustness and efficiency and the effect of model smoothing. J. Statist. Comput. Simulation 50 (1994), 173-185. · doi:10.1080/00949659408811609
[3] M. W. Birch: A new proof of the Pearson-Fisher theorem. Ann. Math. Statist. 35 (1964), 817-824. · Zbl 0259.62017 · doi:10.1214/aoms/1177703581
[4] E. Bofinger: Goodness-of-fit using sample quantiles. J. Roy. Statist. Soc. Ser. B 35 (1973), 277-284. · Zbl 0263.62029
[5] H. Cramér: Mathematical Methods of Statistics. Princeton University Press, Princeton, 1946. · Zbl 0063.01014
[6] N. A. C. Cressie, R. C. Read: Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440-464. · Zbl 0571.62017
[7] R. A. Fisher: Statistical Methods for Research Workers (8th edition). London, 1941.
[8] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig, 1987. · Zbl 0656.62004
[9] B. G. Lindsay: Efficiency versus robustness: The case for minimum Hellinger distance and other methods. Ann. Statist. 22 (1994), 1081-1114. · Zbl 0807.62030 · doi:10.1214/aos/1176325512
[10] M. L. Menéndez, D. Morales, L. Pardo and I. Vajda: Two approaches to grouping of data and related disparity statistics. Comm. Statist. Theory Methods 27 (1998), 609-633. · Zbl 1126.62300 · doi:10.1080/03610929808832117
[11] M. L. Menéndez, D. Morales, L. Pardo and I. Vajda: Minimum divergence estimators based on grouped data. Ann. Inst. Statist. Math. 53 (2001), 277-288. · Zbl 1027.62011 · doi:10.1023/A:1012466605316
[12] D. Morales, L. Pardo and I. Vajda: Asymptotic divergence of estimates of discrete distributions. J. Statist. Plann. Inference 48 (1995), 347-369. · Zbl 0839.62004 · doi:10.1016/0378-3758(95)00013-Y
[13] J. Neyman: Contribution to the theory of the \(\chi ^2\) test. Proc. Berkeley Symp. Math. Statist. Probab., University of California Press, Berkeley, 1949, pp. 239-273. · Zbl 0039.14302
[14] Ch. Park, A. Basu and S. Basu: Robust minimum distance inference based on combined distances. Comm. Statist. Simulation Comput. 24 (1995), 653-673. · Zbl 0850.62243 · doi:10.1080/03610919508813265
[15] C. R. Rao: Asymptotic efficiency and limiting information. Proc. 4th Berkeley Symp. Math. Stat. Probab., University of California Press, Berkeley, 1961, pp. 531-545. · Zbl 0156.39802
[16] C. R. Rao: Linear Statistical Inference and its Applications (2nd edition). Wiley, New York, 1973. · Zbl 0256.62002
[17] R. C. Read, N. A. C. Cressie: Goodness-of-fit Statistics for Discrete Multivariate Data. Springer-Verlag, New York, 1988. · Zbl 0663.62065
[18] C. A. Robertson: On minimum discrepancy estimators. Sankhyā Ser. A 34 (1972), 133-144. · Zbl 0266.62021
[19] I. Vajda: \(\chi ^2\)-divergence and generalized Fisher information. Transactions of the Sixth Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague, 1973, pp. 223-234. · Zbl 0297.62003
[20] I. Vajda: Theory of Statistical Inference and Information. Kluwer Academic Publishers, Boston, 1989. · Zbl 0711.62002
[21] B. L. van der Waerden: Mathematische Statistik. Springer-Verlag, Berlin, 1957. · Zbl 0077.12901
[22] K. Pearson: On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. Philosophical Magazine 50 (1900), 157-172. · JFM 31.0238.04
[23] I. Csiszár: Eine Informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publications of the Mathematical Institute of the Hungarian Academy of Sciences, Series A 8 (1963), 85-108. · Zbl 0124.08703
[24] S. M. Ali, S. D. Silvey: A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 (1966), 131-140. · Zbl 0203.19902
[25] A. Rényi: On measures of entropy and information. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, University of California Press, Berkeley, 1961, pp. 547-561.
[26] A. W. Marshall, I. Olkin: Inequalities: Theory of Majorization and its Applications. Academic Press, New York, 1979. · Zbl 0437.26007