A note on Hammersley’s inequality for estimating the normal integer mean. (English) Zbl 1022.62018

Summary: Let \(X_{1}, X_{2},\dotsc,X_{n}\) be a random sample from a normal \(N(\theta,\sigma^2)\) distribution with an unknown integer mean \(\theta \in \{0,\pm 1, \pm 2,\ldots\}\). J. M. Hammersley [J. R. Stat. Soc., Ser. B 12, 192-240 (1950; Zbl 0040.22202)] proposed the maximum likelihood estimator (MLE) \(d = [\overline{X}_n]\), the integer nearest to the sample mean, as an unbiased estimator of \(\theta\), and extended the Cramér-Rao inequality to this setting. The present paper significantly improves the Hammersley lower bound for the variance of any unbiased estimator of \(\theta\), and determines the asymptotic (as \(n\rightarrow\infty\)) limit of the Fraser-Guttman-Bhattacharyya bounds. A limiting property of a suitable distance is used to give some plausible explanations of why such bounds cannot be attained. An almost uniformly minimum variance unbiased (UMVU)-like property of \(d\) is exhibited.
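The estimator discussed in the summary, \(d = [\overline{X}_n]\), is simple to realize numerically. The following Python sketch (an illustration by this reviewer, not code from the paper; the function name `integer_mean_mle` and the parameter values \(\theta=3\), \(\sigma=1\), \(n=25\) are our own choices) shows a Monte Carlo check that the nearest-integer MLE recovers an integer mean:

```python
import random
import statistics

def integer_mean_mle(sample):
    """Nearest-integer MLE d = [X_bar_n] for a normal mean restricted
    to the integers. Illustrative sketch only; the name is ours."""
    return round(statistics.fmean(sample))

# Monte Carlo check with assumed values theta = 3, sigma = 1, n = 25.
# The standard error of the sample mean is sigma/sqrt(n) = 0.2, so the
# rounded estimate equals theta in nearly every replication.
random.seed(0)
theta, sigma, n = 3, 1.0, 25
estimates = [
    integer_mean_mle([random.gauss(theta, sigma) for _ in range(n)])
    for _ in range(2000)
]
print(statistics.fmean(estimates))  # close to theta = 3
```

Note that Python's built-in `round` rounds half to even; since the sample mean is almost surely not a half-integer, this does not affect the estimator in practice.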


62F10 Point estimation
62F12 Asymptotic properties of parametric estimators


Zbl 0040.22202
Full Text: DOI EuDML