Convergence rates for empirical Bayes estimation in the uniform \(U(0,\theta)\) distribution. (English) Zbl 0649.62003

Let \(\{(X_i,\theta_i)\}\) be a sequence of independent random vectors, where \(X_i\) has a uniform density \(U(0,\theta_i)\) with \(0<\theta_i<m\) \((<\infty)\), and the unobservable \(\theta_i\) are i.i.d. with common distribution \(G\) in some class \(\mathcal{G}\) of prior distributions. In the \((n+1)\)st problem one estimates \(\theta_{n+1}\) by \(t_n(X_1,\dots,X_n,X_{n+1})\doteq t_n(X)\), incurring the risk \(R_n\doteq E(t_n(X)-\theta_{n+1})^2\), where \(E\) denotes expectation with respect to all the random variables \(\{(X_i,\theta_i)\}_{i=1}^{n+1}\). Let \(R\) be the infimum Bayes risk with respect to \(G\).
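Under squared-error loss the Bayes estimator against \(G\) admits a closed form in terms of the marginal distribution of \(X\); the following is a standard computation (a sketch in our notation, not taken from the paper under review), writing \(f\) and \(F\) for the marginal density and cdf of \(X\):

```latex
% Marginal density and cdf of X when X | theta ~ U(0, theta), theta ~ G:
\[
  f(x) = \int_x^m \theta^{-1}\, dG(\theta), \qquad
  F(x) = \int_0^x f(u)\, du = G(x) + x f(x).
\]
% Posterior mean (Bayes estimator under squared-error loss):
\[
  t_G(x) = E(\theta \mid X = x)
         = \frac{\int_x^m \theta \cdot \theta^{-1}\, dG(\theta)}
                {\int_x^m \theta^{-1}\, dG(\theta)}
         = \frac{1 - G(x)}{f(x)}
         = x + \frac{1 - F(x)}{f(x)}.
\]
```

The last equality follows from \(1-G(x) = 1-F(x)+x f(x)\); it expresses the Bayes rule purely through the observable marginal, which is what makes the empirical Bayes approach feasible here.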
In this paper the author exhibits empirical Bayes estimators for which \(R_n-R\) converges at the rate \(O(n^{-})\), and shows that there is a sequence of empirical Bayes estimators for which \(R_n-R\) is bounded below by a quantity of the same order \(n^{-}\), so the rate is sharp.
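Since the Bayes estimator under squared-error loss can be written as \(t_G(x)=x+(1-F(x))/f(x)\), with \(F\) and \(f\) the marginal cdf and density of \(X\), a natural plug-in empirical Bayes rule replaces them by the empirical cdf and a kernel density estimate built from the past observations. A minimal Python sketch (the bandwidth rule, rectangular kernel, and truncation to \([x,m]\) are illustrative choices of ours, not the construction analyzed in the paper):

```python
import random

def eb_estimate(x_new, xs, m, h=None):
    """Plug-in empirical Bayes estimate of theta in the U(0, theta) model.

    Uses the identity t_G(x) = x + (1 - F(x)) / f(x), estimating the
    marginal cdf F by the empirical cdf and the marginal density f by a
    rectangular-kernel density estimate (bandwidth rule is illustrative).
    """
    n = len(xs)
    if h is None:
        h = n ** (-1.0 / 3.0)  # illustrative bandwidth choice
    F_hat = sum(1 for x in xs if x <= x_new) / n
    f_hat = sum(1 for x in xs if abs(x - x_new) <= h) / (2 * h * n)
    if f_hat <= 0:
        return min(x_new, m)  # fall back when the density estimate vanishes
    t = x_new + (1 - F_hat) / f_hat
    # theta >= x always, and theta < m by assumption, so truncate:
    return min(max(t, x_new), m)

# Simulate past data: theta_i i.i.d. G = U(1, m), then X_i | theta_i ~ U(0, theta_i).
random.seed(0)
m = 2.0
thetas = [random.uniform(1.0, m) for _ in range(2000)]
xs = [random.uniform(0.0, t) for t in thetas]
est = eb_estimate(0.8, xs, m)
```

The truncation step reflects that any reasonable estimate of \(\theta_{n+1}\) must lie in \([X_{n+1}, m]\); the rate results in the paper concern how fast the excess risk \(R_n-R\) of such plug-in rules can vanish.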

MSC:

62C12 Empirical decision procedures; empirical Bayes procedures
62F10 Point estimation
62C25 Compound decision problems in statistical decision theory
62F15 Bayesian inference
Full Text: DOI