Yatracos, Yannis G.
Rates of convergence of minimum distance estimators and Kolmogorov's entropy. (English)
Zbl 0576.62057
Ann. Stat. 13, 768-774 (1985).

Summary: Let \(({\mathcal H},{\mathcal A})\) be a space with a \(\sigma\)-field, let \(M=\{P_s;\, s\in\Theta\}\) be a family of probability measures on \({\mathcal A}\) with \(\Theta\) arbitrary, and let \(X_1,\dots,X_n\) be i.i.d. observations from \(P_\theta\). Define \(\mu_n(A)=(1/n)\sum_{i=1}^{n} I_A(X_i)\), the empirical measure indexed by \(A\in{\mathcal A}\). Assume \(\Theta\) is totally bounded when metrized by the \(L_1\) distance between measures. Robust minimum distance estimators \(\hat\theta_n\) of \(\theta\) are constructed, and the resulting rate of convergence is shown to depend naturally on an entropy function for \(\Theta\).

MSC:
62G05 Nonparametric estimation
62G30 Order statistics; empirical distribution functions

Keywords: Kolmogorov entropy; density estimation; empirical measure; robust minimum distance estimators; rate of convergence
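The summary does not spell out the construction. In the form of this estimator later popularized by Devroye and Lugosi, one minimizes, over a finite \(L_1\)-net of candidate measures, the largest discrepancy \(|P_s(A)-\mu_n(A)|\) over the sets \(A_{s,t}=\{x: p_s(x)>p_t(x)\}\). The following is a minimal NumPy sketch under that reading, assuming a finite candidate family of densities on a uniformly discretized real line; the function names, the binning scheme, and the Gaussian location family are illustrative choices, not from the paper.

import numpy as np

def gauss(mu):
    """Standard-normal density shifted to mean mu (illustrative candidate family)."""
    return lambda x: np.exp(-(x - mu) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

def yatracos_estimate(candidates, sample, grid):
    """Pick the candidate minimizing max_{s != t} |P_s(A_{s,t}) - mu_n(A_{s,t})|,
    where A_{s,t} = {x : p_s(x) > p_t(x)} is evaluated on a uniform grid."""
    dx = grid[1] - grid[0]                           # uniform grid spacing assumed
    dens = np.array([p(grid) for p in candidates])   # candidate densities on the grid
    # bin each observation into its grid cell once (crude, for the sketch only)
    idx = np.clip(np.searchsorted(grid, sample) - 1, 0, len(grid) - 1)
    best, best_crit = 0, np.inf
    for s in range(len(dens)):
        crit = 0.0
        for t in range(len(dens)):
            if s == t:
                continue
            A = dens[s] > dens[t]            # Yatracos set A_{s,t} on the grid
            P_s_A = dens[s][A].sum() * dx    # P_s(A_{s,t}) by Riemann sum
            mu_n_A = A[idx].mean()           # empirical measure mu_n(A_{s,t})
            crit = max(crit, abs(P_s_A - mu_n_A))
        if crit < best_crit:
            best, best_crit = s, crit
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.linspace(-6.0, 6.0, 2001)
    thetas = np.linspace(-2.0, 2.0, 21)          # finite net over Theta (location parameters)
    candidates = [gauss(m) for m in thetas]
    sample = rng.normal(0.4, 1.0, size=500)      # data from P_theta with theta = 0.4
    print("estimated theta:", thetas[yatracos_estimate(candidates, sample, grid)])

To match the rate statement in the summary, the candidate family here would be taken as an \(\varepsilon\)-net of \(\Theta\) in the \(L_1\) metric, with \(\varepsilon\) balanced against the entropy function of \(\Theta\).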