Vajda, I. On the f-divergence and singularity of probability measures. (English) Zbl 0248.62001
Period. Math. Hung. 2, 223–234 (1972).
Cited in 1 Review; cited in 31 Documents.

MSC:
62A01 Foundations and philosophical topics in statistics
62F03 Parametric hypothesis testing
62B10 Statistical aspects of information-theoretic topics
94A15 Information theory (general)

Cite: \textit{I. Vajda}, Period. Math. Hung. 2, 223--234 (1972; Zbl 0248.62001)

References:
[1] I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), Ser. A, 85–108. · Zbl 0124.08703
[2] I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar. 2 (1967), 299–318. · Zbl 0157.25802
[3] M. S. Pinsker, Information and information stability of random variables and processes, Moscow, 1960 (in Russian). · Zbl 0104.36702
[4] S. Kullback, Information theory and statistics, New York, 1959. · Zbl 0088.10406
[5] H. Hahn, Über die Integrale des Herrn Hellinger und die Orthogonalinvarianten der quadratischen Formen von unendlich vielen Veränderlichen, Monatsh. Math. Phys. 23 (1912), 161–224. · JFM 43.0421.03
[6] A. M. Kagan, Towards the theory of Fisher's amount of information, Dokl. Akad. Nauk SSSR 151 (1963), 277–278 (in Russian). · Zbl 0138.14902
[7] I. Vajda, On the amount of information contained in a sequence of independent observations, Kybernetika (Prague) 6 (1970), 306–323. · Zbl 0202.17802
[8] A. Rényi, On the foundations of information theory, Rev. ISI 33 (1965), 1–14. · Zbl 0161.16903
[9] A. Rényi, On measures of entropy and information, Proc. 4th Berkeley Symp. on Math. Stat. and Prob., Vol. I, Berkeley, 1960, 547–561.
[10] G. H. Hardy, J. E. Littlewood and G. Pólya, Inequalities, Cambridge, 1959.
[11] S. Kullback, A lower bound for discrimination information in terms of variation, IEEE Trans. Information Theory 13 (1967), 126–127.
[12] P. R. Halmos, Measure theory, New York, 1966.
[13] J. L. Doob, Stochastic processes, New York, 1953. · Zbl 0053.26802
[14] A. Perez, Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie des martingales, Trans. 1st Prague Conf. on Information Theory, Prague, 1957, 193–208.
[15] T. E. Duncan, On the absolute continuity of measures, Ann. Math. Statist. 41 (1970), 30–38. · Zbl 0191.46705
[16] I. Vajda, Limit theorems for total variation of Cartesian product measures, Studia Sci. Math. Hungar. 6 (1971), 317–333. · Zbl 0243.62034
[17] S. Kakutani, On equivalence of infinite product measures, Ann. of Math. 49 (1948), 214–226. · Zbl 0030.02303
[18] J. Hájek, On a property of normal distributions of an arbitrary stochastic process, Czechoslovak Math. J. 8 (1958), 610–618. · Zbl 0086.33503
[19] I. Vajda, Note on discrimination information and variation, IEEE Trans. Information Theory 16 (1970), 771–773. · Zbl 0206.21001