# zbMATH — the first resource for mathematics

Minimum divergence principle in statistical estimation. (English) Zbl 0558.62004
Recent results in estimation theory and related topics, Suppl. Issues Stat. Decis. 1, 239-261 (1984).
[For the entire collection see Zbl 0542.00015.]
Statistical estimators (D-estimators) minimizing the f-divergence between a theoretical probability $$P_{\theta}\in {\mathcal P}_{\Theta}$$ and the empirical probability $$P_n$$ on a measurable space $$(X,{\mathcal B})$$ are introduced. Suitable specifications of a convex function f yield promising new estimators, as well as well-known but so far quite diversely motivated ones. Examples include the MLE and other M-estimators, the minimum distance estimators considered in the literature, and certain "smooth approximations" to A17-A25 emerging from the Princeton robustness study.
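Spelled out under the usual conventions (a sketch of the standard definitions, not taken verbatim from the paper, glossing over dominating-measure questions): for a convex function $$f$$ with $$f(1)=0$$, the f-divergence and the corresponding D-estimator may be written as
$$D_f(P_{\theta},P_n)=\int_X f\Bigl(\frac{dP_{\theta}}{dP_n}\Bigr)\,dP_n, \qquad \hat\theta_n=\arg\min_{\theta\in\Theta} D_f(P_{\theta},P_n).$$
For instance, $$f(u)=-\log u$$ gives $$D_f(P_{\theta},P_n)=\frac{1}{n}\sum_{i=1}^{n}\log\frac{dP_n}{dP_{\theta}}(x_i)$$, so minimizing over $$\theta$$ amounts to maximizing the empirical log-likelihood, i.e. the MLE.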
The class of D-estimators is homogeneous enough to be studied as a whole. In this sense, the minimum divergence approach is a possible alternative to the loss-function approach in a systematic development of statistical estimation theory. The present paper is restricted to non-asymptotic aspects of the theory, with emphasis on motivation, examples, and the existence and measurability of D-estimators. In particular, it is shown that under quite general regularity conditions the D-estimates are continuous functions of the sample vectors.
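As a concrete numerical illustration (not taken from the paper), the following minimal Python sketch computes a D-estimate for a Bernoulli family by grid search, using the squared Hellinger distance, an f-divergence with $$f(u)=\tfrac{1}{2}(\sqrt{u}-1)^2$$. For this fully parametrized two-point family the minimizer coincides with the empirical frequency, hence with the MLE.

```python
import math

def hellinger2(p_model, p_emp):
    # Squared Hellinger distance between two discrete distributions,
    # an f-divergence with f(u) = (sqrt(u) - 1)^2 / 2.
    return 0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                     for a, b in zip(p_model, p_emp))

def d_estimate(sample, grid_steps=10000):
    # D-estimator for a Bernoulli(theta) family: minimize the divergence
    # between the model (1 - theta, theta) and the empirical distribution.
    n = len(sample)
    p_emp = (sample.count(0) / n, sample.count(1) / n)
    best_theta, best_d = None, float("inf")
    for k in range(1, grid_steps):
        theta = k / grid_steps
        d = hellinger2((1 - theta, theta), p_emp)
        if d < best_d:
            best_theta, best_d = theta, d
    return best_theta

sample = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 7 successes in 10 trials
theta_hat = d_estimate(sample)           # close to 0.7, the MLE
```

Replacing `hellinger2` by another f-divergence changes the estimator's robustness properties while leaving the minimum-divergence scheme itself unchanged, which is the homogeneity the review emphasizes.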

##### MSC:
62A01 Foundations and philosophical topics in statistics

62F10 Point estimation