Heyde, C. C.; Au, K.
A cautionary note on model choice and the Kullback-Leibler information. (English)
Zbl 1427.62005
J. Stat. Theory Pract. 2, No. 2, 221-232 (2008).

Summary: The Kullback-Leibler information has found application in many areas of statistical science. It typically arises in model choice and model dimension questions in a way which suggests its use as a distance. Indeed, it has been widely described as a distance, although it comprehensively fails to be a metric. Some pitfalls in interpreting it as a distance are discussed, in particular its use to discriminate between the distributions of prospective risky asset returns.

MSC:
62B10 Statistical aspects of information-theoretic topics
62G32 Statistics of extreme values; tail inference
62P05 Applications of statistics to actuarial sciences and financial mathematics
94A17 Measures of information, entropy

Keywords: discrimination; entropy; heavy-tailed; statistical distance
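As a brief illustration of the non-metric point in the summary (a standard sketch, not taken from the paper itself): writing $K(f,g)$ for the Kullback-Leibler information of $g$ relative to $f$, symmetry already fails for two zero-mean normal densities with different variances.

\[
K(f,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx,
\qquad
K\bigl(N(0,\sigma_1^2),\,N(0,\sigma_2^2)\bigr)
 = \log\frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2}{2\sigma_2^2} - \frac{1}{2},
\]

which is not symmetric in $(\sigma_1,\sigma_2)$, and the triangle inequality also fails in general, so $K$ is a divergence rather than a distance.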