
Estimation, prediction and the Stein phenomenon under divergence loss. (English) Zbl 1274.62080

Summary: We consider two problems: (1) estimate a normal mean under a general divergence loss introduced in [S. Amari, Ann. Stat. 10, 357–385 (1982; Zbl 0507.62026); N. Cressie and T. R. C. Read, J. R. Stat. Soc., Ser. B 46, 440–464 (1984; Zbl 0571.62017)] and (2) find a predictive density of a new observation drawn independently of observations sampled from a normal distribution with the same mean but possibly with a different variance, under the same loss. The general divergence loss includes as special cases both the Kullback-Leibler and Bhattacharyya-Hellinger losses. The sample mean, which is a Bayes estimator of the population mean under this loss and the improper uniform prior, is shown to be minimax in any dimension. A counterpart of this result for the predictive density is also proved in any dimension. The admissibility of these rules holds in one dimension, and we conjecture that the result is true in two dimensions as well. However, the general Baranchick class of estimators [A. J. Baranchick, “A family of minimax estimators of the mean of a multivariate normal distribution”, Ann. Math. Statist. 41, 642–645 (1970), http://www.jstor.org/stable/2239362], which includes the James-Stein estimator and the Strawderman class of estimators [W. E. Strawderman, Ann. Math. Stat. 42, 385–388 (1971; Zbl 0222.62006)], dominates the sample mean in three or higher dimensions for the estimation problem. An analogous class of predictive densities is defined, and any member of this class is shown to dominate the predictive density corresponding to a uniform prior in three or higher dimensions. For the prediction problem, in the special case of Kullback-Leibler loss, our results complement, to a certain extent, some of the recent important work of F. Komaki [Biometrika 88, No. 3, 859–864 (2001; Zbl 0985.62024)] and E. I. George et al. [Ann. Stat. 34, No. 1, 78–91 (2006; Zbl 1091.62003)].
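As an illustrative sketch (not spelled out in the summary above), the general divergence loss for a p-variate normal mean with identity covariance is presumably of the following power-divergence form, parametrized by 0 < β < 1; the closed form of the Gaussian affinity integral and the two named special cases then fall out directly:

```latex
% Assumed form of the divergence loss family, 0 < beta < 1;
% for f(. | theta) the N_p(theta, I_p) density, the affinity
% integral has a closed form.
\[
  L_\beta(\theta, a)
  = \frac{1 - \int f^{\beta}(x \mid \theta)\, f^{1-\beta}(x \mid a)\, dx}
         {\beta(1-\beta)}
  = \frac{1 - \exp\!\left(-\tfrac{\beta(1-\beta)}{2}\,\|a - \theta\|^{2}\right)}
         {\beta(1-\beta)}.
\]
% Special cases: the Kullback-Leibler loss arises in the limit,
% and beta = 1/2 gives a Bhattacharyya-Hellinger-type loss.
\[
  \lim_{\beta \to 0} L_\beta(\theta, a) = \tfrac{1}{2}\,\|a - \theta\|^{2},
  \qquad
  L_{1/2}(\theta, a) = 4\left(1 - e^{-\|a - \theta\|^{2}/8}\right).
\]
```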
Our proposed approach thus produces a general class of predictive densities (not necessarily Bayes, but not excluding Bayes predictors) dominating the predictive density under a uniform prior. We also show that various modifications of the James-Stein estimator continue to dominate the sample mean and, by a duality between the estimation and predictive-density results that we establish, analogous domination results continue to hold for the prediction problem as well.
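The Stein phenomenon referred to above (domination of the sample mean in three or more dimensions) is easy to see numerically under quadratic loss, the Kullback-Leibler limit of the divergence family. The following minimal Monte Carlo sketch is illustrative only; the setup (dimension, true mean, replication count) is assumed, not taken from the paper:

```python
import random

def james_stein(x):
    """Plain James-Stein shrinkage (1 - (p-2)/||x||^2) x, for p >= 3.

    Modifications such as the positive-part version max(0, 1 - (p-2)/||x||^2) x
    shrink further and also dominate the sample mean.
    """
    p = len(x)
    s = sum(v * v for v in x)
    shrink = 1.0 - (p - 2) / s
    return [shrink * v for v in x]

def risk(estimator, theta, reps=2000, seed=0):
    """Monte Carlo estimate of the quadratic risk E||delta(X) - theta||^2
    for a single observation X ~ N_p(theta, I_p)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x = [t + rng.gauss(0.0, 1.0) for t in theta]
        est = estimator(x)
        total += sum((e - t) ** 2 for e, t in zip(est, theta))
    return total / reps

theta = [0.0] * 10                      # true mean, p = 10 dimensions
mle_risk = risk(lambda x: x, theta)     # sample mean / MLE: risk approx p = 10
js_risk = risk(james_stein, theta)      # James-Stein: markedly smaller at theta = 0
```

The gap is largest near the shrinkage target and narrows as the true mean moves away, but the James-Stein risk never exceeds that of the sample mean when p ≥ 3.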

MSC:

62C15 Admissibility in statistical decision theory
62C20 Minimax procedures in statistical decision theory
62C12 Empirical decision procedures; empirical Bayes procedures

References:

[1] Aitchison, J., Goodness of prediction fit, Biometrika, 62, 547-554 (1975) · Zbl 0339.62018
[2] Amari, S., Differential geometry of curved exponential families — curvatures and information loss, Ann. Statist., 10, 357-387 (1982) · Zbl 0507.62026
[3] Baranchick, A. J., A family of minimax estimators of the mean of a multivariate normal distribution, Ann. Math. Statist., 41, 642-645 (1970) · Zbl 0204.52504
[4] Baranchick, A. J., Inadmissibility of maximum likelihood estimators in some multiple regression problems with three or more independent variables, Ann. Statist., 1, 312-321 (1973) · Zbl 0271.62010
[5] Bhattacharyya, A. K., On a measure of divergence between two statistical populations defined by their probability distributions, Bull. Calcutta Math. Soc., 35, 99-109 (1943) · Zbl 0063.00364
[6] Barlow, R. E.; Proschan, F., Statistical Theory of Reliability and Life Testing: Probability Models (1975), Holt, Rinehart and Winston, Inc. · Zbl 0379.62080
[7] Blyth, C. R., On minimax statistical decision procedures and their admissibility, Ann. Math. Statist., 22, 22-42 (1951) · Zbl 0042.38303
[8] Brandwein, A. C.; Strawderman, W. E., Minimax estimators of location parameters for spherically symmetric distributions with concave loss, Ann. Statist., 8, 279-284 (1980) · Zbl 0432.62008
[9] Brown, L. D., On the admissibility of invariant estimator of one or more location parameters, Ann. Math. Statist., 38, 1087-1136 (1966) · Zbl 0156.39401
[10] Brown, L. D., Admissible estimators, recurrent diffusions and insoluble boundary value problems, Ann. Math. Statist., 42, 855-903 (1971) · Zbl 0246.62016
[11] Brown, L. D.; Fox, M., Admissibility of procedures in two-dimensional location parameter problems, Ann. Statist., 2, 248-266 (1974) · Zbl 0287.62004
[12] Corcuera, J. M.; Giummole, F., A generalized Bayes rule for prediction, Scand. J. Stat., 26, 265-279 (1999) · Zbl 0934.62027
[13] Cressie, N.; Read, T. R.C., Multinomial goodness-of-fit tests, J. Roy. Statist. Soc. Ser. B, 46, 440-464 (1984) · Zbl 0571.62017
[14] Eaton, M. L., A statistical diptych: Admissible inferences—a recurrence of symmetric Markov chains, Ann. Statist., 20, 1147-1179 (1992) · Zbl 0767.62002
[15] Efron, B.; Morris, C., Stein’s estimation rule and its competitors—an empirical Bayes approach, J. Amer. Statist. Assoc., 68, 117-130 (1973) · Zbl 0275.62005
[16] Faith, R. E., Minimax Bayes and point estimations of a multivariate normal mean, J. Multivariate Anal., 8, 372-379 (1978) · Zbl 0411.62017
[17] Gatsonis, C., Deriving posterior distributions for a location parameter: A decision theoretic approach, Ann. Statist., 3, 958-970 (1984) · Zbl 0544.62008
[18] George, E. I.; Liang, F.; Xu, X., Improved minimax predictive densities under Kullback-Leibler loss, Ann. Statist., 34, 78-91 (2006) · Zbl 1091.62003
[19] Hellinger, E., Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen, J. Reine Angewandte Math., 136, 210-271 (1909) · JFM 40.0393.01
[20] Hodges, J. L.; Lehmann, E. L., Some problems in minimax point estimation, Ann. Math. Statist., 21, 182-197 (1950) · Zbl 0038.09802
[21] James, W.; Stein, C., Estimation with quadratic loss, (Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1 (1961), University of California Press), 361-380 · Zbl 1281.62026
[22] Komaki, F., A shrinkage predictive distribution for multivariate normal observations, Biometrika, 88, 859-864 (2001) · Zbl 0985.62024
[23] Lindley, D. V., Discussion of Professor Stein’s paper ‘Confidence sets for the mean of a multivariate distribution’, J. Roy. Statist. Soc. Ser. B., 24, 265-296 (1962)
[24] Robert, C. P., The Bayesian Choice (2001), Springer-Verlag: Springer-Verlag New York
[25] Strawderman, W. E., Proper Bayes minimax estimators of the multivariate normal mean, Ann. Math. Statist., 42, 385-388 (1971) · Zbl 0222.62006
[26] Stein, C., Inadmissibility of the usual estimator for the mean of a multivariate normal distribution, (Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability (1955), University of California Press: University of California Press Berkeley and Los Angeles), 197-206
[27] Stein, C., The admissibility of Pitman’s estimator for a single location parameter, Ann. Math. Statist., 30, 970-979 (1959) · Zbl 0087.15101