Intrinsic losses. (English) Zbl 0848.90010

Summary: Since the choice of a particular loss function strongly influences the resulting inference, it seems necessary to rely on “intrinsic” losses when no information is available about the utility function of the decision-maker, rather than to fall back on classical losses such as the squared-error loss. Because this setting is quite similar to the derivation of noninformative priors in Bayesian analysis, we first recall the conditions of that derivation and deduce from them some requirements on intrinsic losses. It then appears that these loss functions should depend only on the sampling distribution and should be independent of the parametrization of the distribution. The resulting estimators are therefore transformation equivariant. We study the properties of two natural intrinsic losses, namely the entropy and Hellinger losses, and show that they can be expressed in closed form for exponential families.
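
In the usual formulation (a sketch of the standard definitions; the paper's normalization conventions may differ), both losses compare the estimated sampling distribution with the true one:
\[
L_e(\theta,\delta)=\mathbb{E}_\theta\!\left[\log\frac{f(x\mid\theta)}{f(x\mid\delta)}\right],
\qquad
L_H(\theta,\delta)=\frac{1}{2}\,\mathbb{E}_\theta\!\left[\left(\sqrt{\frac{f(x\mid\delta)}{f(x\mid\theta)}}-1\right)^{2}\right].
\]
Both quantities involve \(\theta\) and \(\delta\) only through the densities \(f(\cdot\mid\theta)\) and \(f(\cdot\mid\delta)\), so they are unchanged under any one-to-one reparametrization \(\eta=g(\theta)\); this invariance is what makes the corresponding Bayes estimators transformation equivariant.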

Moreover, the entropy loss also yields analytic expressions for Bayes estimators under conjugate priors; the derivation of Bayes estimators associated with the Hellinger loss is more cumbersome, as shown in the Poisson and Gamma cases, although it leads to similar estimators.
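
As a concrete illustration (standard closed forms for the Poisson family, not a reproduction of the paper's derivation): for \(X\sim\mathcal{P}(\lambda)\),
\[
L_e(\lambda,\delta)=\lambda\log\frac{\lambda}{\delta}+\delta-\lambda,
\qquad
L_H(\lambda,\delta)=1-\exp\!\left(-\tfrac{1}{2}\bigl(\sqrt{\lambda}-\sqrt{\delta}\bigr)^{2}\right).
\]
Minimizing the posterior expected entropy loss in \(\delta\) gives \(\delta=\mathbb{E}[\lambda\mid x]\), so under a conjugate \(\mathcal{G}a(\alpha,\beta)\) (shape–rate) posterior the entropy-loss Bayes estimate is simply \(\alpha/\beta\); the first-order condition for the Hellinger loss admits no comparably simple solution, which is the more cumbersome derivation mentioned above.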

MSC:
91B16 Utility theory
62J99 Linear statistical inference
62F15 Bayesian inference
62C10 Bayesian problems; characterization of Bayes procedures
62C15 Statistical admissibility