\(\lambda\)-measures of hypoentropy and their applications. (English) Zbl 0616.62008

C. Ferreri [ibid. 40, 155-168 (1980; Zbl 0454.62004)] defined the \(\lambda\)-measures of hypoentropy, the \(\lambda\)-mutual information and the \(\lambda\)-discrimination function as generalizations of the well-known Shannon entropy, Shannon mutual information and Kullback-Leibler discrimination function, respectively (each of which is recovered in the limit as \(\lambda\) tends to \(\infty\)). In the present paper the author first studies properties of these measures and establishes an alternative generalization of Shannon's mutual information in terms of the \(\lambda\)-discrimination function; this generalization satisfies interesting properties of the Shannon measure that are not preserved by the extension in Ferreri's paper. The \(\lambda\)-divergence is derived immediately from it. The preceding measures are then applied to obtain results on the Hamming distance, Markov chains, information geometry, and bounds for the probability of error for use in pattern recognition. Following D. Blackwell [see Proc. Berkeley Sympos. math. Statist. Probability, 1950, 93-102 (1951; Zbl 0044.142)] and P. K. Goel and M. H. DeGroot [Ann. Stat. 7, 1066-1077 (1979; Zbl 0412.62004)], two criteria for comparing experiments, based on the \(\lambda\)-discrimination function and the \(\lambda\)-divergence, are established, and their relation to Blackwell's criterion is examined. Finally, a parametric measure of information is proposed from the \(\lambda\)-discrimination function and its connection with the Fisher information is stated.
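For orientation, a minimal sketch of the \(\lambda\)-hypoentropy and \(\lambda\)-discrimination function in the finite discrete case, in the form commonly cited from Ferreri's 1980 paper (the exact normalizations in the paper under review may differ): for distributions \(P=(p_1,\dots,p_n)\), \(Q=(q_1,\dots,q_n)\) and \(\lambda>0\),
\[
H_\lambda(P)=\Bigl(1+\frac{1}{\lambda}\Bigr)\log(1+\lambda)-\frac{1}{\lambda}\sum_{i=1}^{n}(1+\lambda p_i)\log(1+\lambda p_i),
\]
\[
K_\lambda(P,Q)=\frac{1}{\lambda}\sum_{i=1}^{n}(1+\lambda p_i)\log\frac{1+\lambda p_i}{1+\lambda q_i},
\]
so that \(H_\lambda(P)\to-\sum_i p_i\log p_i\) (Shannon entropy) and \(K_\lambda(P,Q)\to\sum_i p_i\log(p_i/q_i)\) (Kullback-Leibler discrimination) as \(\lambda\to\infty\).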
Reviewer: M. A. Gil Alvarez

MSC:

62B10 Statistical aspects of information-theoretic topics
94A17 Measures of information, entropy
62B15 Theory of statistical experiments
94A15 Information theory (general)