Rates of convergence of minimum distance estimators and Kolmogorov’s entropy. (English) Zbl 0576.62057

Summary: Let \(({\mathcal H},{\mathcal A})\) be a space with a \(\sigma\)-field, let \(M=\{P_s : s\in\Theta\}\) be a family of probability measures on \({\mathcal A}\) with \(\Theta\) arbitrary, and let \(X_1,\dots,X_n\) be i.i.d. observations from \(P_{\theta}\). Define \(\mu_n(A)=\frac{1}{n}\sum_{i=1}^{n} I_A(X_i)\), the empirical measure indexed by \(A\in {\mathcal A}\). Assume \(\Theta\) is totally bounded when metrized by the \(L_1\) distance between measures. Robust minimum distance estimators \(\hat\theta_n\) of \(\theta\) are constructed, and the resulting rate of convergence is shown to depend naturally on an entropy function for \(\Theta\).
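To make the construction concrete, here is a minimal numerical sketch, not the paper's exact construction: the family \(\{P_s\}\) is taken to be a Gaussian location family on a gridded, totally bounded \(\Theta\), and the estimator minimizes \(\sup_A |\mu_n(A)-P_s(A)|\) over half-lines \(A=(-\infty,a]\). The choice of family, grid, and test sets are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def empirical_measure(x, endpoints):
    """mu_n(A) = (1/n) * sum_i I_A(X_i) for half-lines A = (-inf, a]."""
    return np.array([(x <= a).mean() for a in endpoints])

def min_distance_estimate(x, theta_grid, endpoints):
    """Pick the s in theta_grid minimizing sup_A |mu_n(A) - P_s(A)|."""
    mu_n = empirical_measure(x, endpoints)
    dists = [np.max(np.abs(mu_n - norm.cdf(endpoints, loc=s)))
             for s in theta_grid]
    return theta_grid[int(np.argmin(dists))]

# Illustrative setup (assumed, not from the paper): P_theta = N(0.3, 1).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=500)       # i.i.d. draws from P_theta
theta_grid = np.linspace(-2.0, 2.0, 81)            # totally bounded Theta, gridded
endpoints = np.linspace(-4.0, 4.0, 50)             # endpoints a of sets A = (-inf, a]
print(min_distance_estimate(x, theta_grid, endpoints))  # close to theta = 0.3
```

With half-lines as the index class this distance is the Kolmogorov distance; the paper's entropy-based rate results concern more general classes of sets adapted to \(\Theta\).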

MSC:

62G05 Nonparametric estimation
62G30 Order statistics; empirical distribution functions