
Motivation, existence and equivariance of D-estimators. (English) Zbl 0558.62026

Author’s introduction: This is the first in a series of papers on D-estimators to be published in Kybernetika. D-estimators minimize an f-divergence, or a suitably modified f-divergence, between the theoretical and the empirical probability distribution. Suitable choices of the convex function f yield either new, promising estimators or well-known ones such as the MLE, M-estimators, and various minimum distance estimators, which until now have been motivated quite diversely, if motivated at all.
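The defining idea can be sketched numerically. Below is a minimal, hypothetical illustration (not taken from the paper): a D-estimator for a Bernoulli(theta) model using f(t) = t log t, for which the f-divergence between the empirical and the theoretical distribution is the Kullback-Leibler divergence, so the resulting estimate coincides with the MLE (here, the sample mean). The function names and the grid-search minimization are illustrative choices, not the author's construction.

```python
import math

def kl_divergence(p_emp, p_model):
    """f-divergence with f(t) = t log t, i.e. Kullback-Leibler
    divergence of the empirical distribution from the model."""
    return sum(p * math.log(p / q) for p, q in zip(p_emp, p_model) if p > 0)

def d_estimate_bernoulli(sample, grid_size=10001):
    """D-estimate: minimize the divergence over candidate parameters theta
    (a simple grid search stands in for a proper minimization)."""
    n = len(sample)
    k = sum(sample)
    p_emp = (1 - k / n, k / n)          # empirical distribution on {0, 1}
    best_theta, best_div = None, float("inf")
    for i in range(1, grid_size - 1):   # exclude theta in {0, 1}
        theta = i / (grid_size - 1)
        div = kl_divergence(p_emp, (1 - theta, theta))
        if div < best_div:
            best_theta, best_div = theta, div
    return best_theta

sample = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # 7 successes out of 10
print(d_estimate_bernoulli(sample))       # matches the MLE 7/10 = 0.7
```

Other convex f (e.g. f(t) = (t - 1)^2 for the chi-square divergence) slot into the same scheme and give other classical minimum distance estimators.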
The theory of D-estimators can be regarded as an alternative to the loss-function-based theory for a systematic development of both asymptotic and non-asymptotic properties of wide classes of estimators. The present paper is devoted to the motivation and examples of D-estimators and to non-asymptotic aspects of the theory, such as existence, measurability, continuity, invariance, and equivariance of D-estimators.
Reviewer: P.Ressel

MSC:

62F10 Point estimation
62F12 Asymptotic properties of parametric estimators
62B10 Statistical aspects of information-theoretic topics
