Generalization of discrimination-rate theorems of Chernoff and Stein. (English) Zbl 0727.62026
The author considers a simple hypothesis and a simple alternative about an abstract random observation parametrized by a directed set. The asymptotics over this set are evaluated for the mixed errors of Bayes tests and the second-kind errors of Neyman-Pearson tests.
Using simple properties of Rényi distances established by F. Liese and the author [Convex statistical distances (1987; Zbl 0656.62004)], together with elementary inequalities for Rényi distances established by O. Krafft and D. Plachky [Ann. Math. Stat. 41, 1646-1654 (1970; Zbl 0214.18003)] and by the author [Studia Sci. Math. Hungar. 6 (1971), 317-333 (1972; Zbl 0243.62034)], he shows that extensions to observations consisting of segments of random sequences, processes and fields depend essentially on the existence of an asymptotic Rényi distance between the hypothesis and the alternative. This distance describes the discrimination rates attained by Bayes and Neyman-Pearson tests.
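To make the two rates concrete, here is a minimal numerical sketch (not taken from the paper, which treats far more general observation schemes) for the i.i.d. discrete case: the Rényi divergence of order \(\alpha\) yields, on one hand, the Chernoff exponent governing the Bayes (mixed) error and, on the other hand, in the limit \(\alpha \to 1\), the Kullback-Leibler divergence that Stein's lemma identifies as the second-kind error exponent of Neyman-Pearson tests. The function names and the example distributions are illustrative assumptions only.

import numpy as np

def renyi_divergence(p, q, alpha):
    # Renyi divergence D_alpha(p || q) of order alpha in (0, 1), in nats.
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def chernoff_exponent(p, q, grid=np.linspace(0.01, 0.99, 99)):
    # Chernoff information C(p, q) = max_{0 < a < 1} (1 - a) * D_a(p || q):
    # the exponential decay rate of the Bayes (mixed) error for i.i.d. samples.
    return max((1.0 - a) * renyi_divergence(p, q, a) for a in grid)

def stein_exponent(p, q):
    # Kullback-Leibler divergence D(p || q) = lim_{a -> 1} D_a(p || q):
    # by Stein's lemma, the decay rate of the second-kind error of
    # Neyman-Pearson tests with the first-kind error held below a fixed level.
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.6, 0.3, 0.1])   # simple hypothesis
q = np.array([0.3, 0.3, 0.4])   # simple alternative
print("Chernoff (Bayes) exponent:", chernoff_exponent(p, q))
print("Stein (Neyman-Pearson) exponent:", stein_exponent(p, q))

In this product-measure case the asymptotic Rényi distance of the hypothesis and alternative exists trivially, and the two printed numbers are exactly the discrimination rates whose extension to segments of random sequences, processes and fields is the subject of the paper.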

MSC:
62F03 Parametric hypothesis testing
62F15 Bayesian inference
62B10 Statistical aspects of information-theoretic topics
62B99 Sufficiency and information
62F05 Asymptotic properties of parametric tests
References:
[1] P. H. Algoet, T.M. Cover: A sandwich proof of the Shannon-McMillan-Breiman theorem. Ann. Probab. 16 (1988), 899-909. · Zbl 0653.28013 · doi:10.1214/aop/1176991794
[2] A. Bhattacharyya: On some analogues of the amount of information and their use in statistical estimation. Sankhya 8 (1946), 1-14.
[3] H. Chernoff: A measure of asymptotic efficiency for tests of a hypothesis based on a sum of observations. Ann. Math. Statist. 23 (1952), 493-507. · Zbl 0048.11804 · doi:10.1214/aoms/1177729330
[4] H. Chernoff: Large sample theory: Parametric case. Ann. Math. Statist. 27 (1956), 1-22. · Zbl 0072.35703 · doi:10.1214/aoms/1177728347
[5] R. S. Ellis: Large deviations for a general class of random vectors. Ann. Probab. 12 (1984), 1-12. · Zbl 0534.60026 · doi:10.1214/aop/1176993370
[6] R. S. Ellis: Entropy, Large Deviations, and Statistical Mechanics. Springer-Verlag, Berlin-Heidelberg-New York 1985. · Zbl 0566.60097
[7] I. I. Gikhman, A. V. Skorokhod: Stochastic Differential Equations (in Russian). Naukova Dumka, Kiev 1986.
[8] J. Hájek: A property of J-divergences of marginal probability distributions. Czechoslovak Math. J. 8 (1958), 460-463. · Zbl 0082.34103
[9] J. Hájek: On a property of normal distributions of an arbitrary stochastic process. Czechoslovak Math. J. 8 (1958), 610-618. · Zbl 0086.33503 · eudml:11961
[10] A. Janssen: Asymptotic properties of Neyman-Pearson test for infinite Kullback-Leibler information. Ann. Statist. 14 (1986), 1068-1079. · Zbl 0632.62023 · doi:10.1214/aos/1176350050
[11] M. Janžura: Divergences of Gauss-Markov random fields with application to statistical inference. Kybernetika 24 (1988), 6, 401-412. · Zbl 0664.62086 · www.kybernetika.cz · eudml:27339
[12] S. Kakutani: On equivalence of infinite product measures. Ann. of Math. 49 (1948), 214-226. · Zbl 0030.02303 · doi:10.2307/1969123
[13] E. I. Kolomietz: Asymptotic behaviour of type II errors of Neyman-Pearson tests (in Russian). Teor. Veroyatnost. i Primenen. 33 (1986), 503-522.
[14] L. H. Koopmans: Asymptotic rate of discrimination for Markov processes. Ann. Math. Statist. 31 (1960), 982-994. · Zbl 0096.12603 · doi:10.1214/aoms/1177705671
[15] O. Krafft, D. Plachky: Bounds for the power of likelihood ratio tests and their asymptotic properties. Ann. Math. Statist. 41 (1970), 1646-1654. · Zbl 0214.18003
[16] S. Kullback, J. C. Keegel, J. H. Kullback: Topics in Statistical Information Theory. Springer-Verlag, Berlin-Heidelberg-New York 1987. · Zbl 0632.62003
[17] H. Künsch: Thermodynamics and statistical analysis of Gaussian random fields. Z. Wahrsch. verw. Geb. 58 (1981), 407-421. · Zbl 0458.60053
[18] E. L. Lehmann: Testing Statistical Hypotheses. J. Wiley, New York 1959. · Zbl 0089.14102
[19] F. Liese: Hellinger integrals of diffusion processes. Statistics 17 (1986), 63-78. · Zbl 0598.60042 · doi:10.1080/02331888608801912
[20] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987. · Zbl 0656.62004
[21] J. Mémin, A. N. Shiryayev: Distance de Hellinger-Kakutani des lois correspondant à deux processus à accroissements indépendants. Z. Wahrsch. verw. Geb. 70 (1985), 67-89. · Zbl 0569.60038
[22] T. Nemetz: On the \(\alpha\)-divergence rate for Markov-dependent hypotheses. Problems Control Inform. Theory 5 (1974), 147-155. · Zbl 0303.62005
[23] C. M. Newman: The inner product of path space measures corresponding to random processes with independent increments. Bull. Amer. Math. Soc. 78 (1972), 268-272. · Zbl 0233.60041 · doi:10.1090/S0002-9904-1972-12952-5
[24] C. M. Newman: The orthogonality of independent increment processes. Topics in Probability Theory (D. W. Stroock, S. R. S. Varadhan, eds.), Courant Inst. of Math. Sciences, New York 1973, pp. 93-111. · Zbl 0269.60055
[25] C. M. Newman, B. W. Stuck: Chernoff bounds for discriminating between two Markov processes. Stochastics 2 (1979), 139-153. · Zbl 0408.62070 · doi:10.1080/17442507908833121
[26] J. Oosterhoff, W. R. van Zwet: A note on contiguity and Hellinger distance. Contributions to Statistics (J. Hájek Memorial Volume, J. Jurečková, ed.), Reidel, Dordrecht 1979.
[27] A. Perez: Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie des martingales. Trans. 1st Prague Conf. on Inform. Theory, Statist. Dec. Functions, Random Processes. Publ. House Czechosl. Acad. Sci., Prague 1957, pp. 183-206.
[28] A. Perez: Generalization of Chernoff's result on the asymptotic discernibility of two random processes. Colloq. Math. Soc. János Bolyai 9 (1974), 619-632.
[29] M. S. Pinsker: Information and Information Stability of Random Variables and Processes. Holden-Day, San Francisco 1964. · Zbl 0125.09202
[30] A. Rényi: On measures of entropy and information. Proc. 4th Berkeley Symp. on Math. Statist. and Probab., Vol. 1, Univ. of California Press, Berkeley 1961, pp. 547-561.
[31] I. Vajda: Limit theorems for total variation of cartesian product measures. Studia Sci. Math. Hungar. 6 (1971), 317-333. · Zbl 0243.62034
[32] I. Vajda: Theory of Statistical Inference and Information. Kluwer, Dordrecht-Boston 1989. · Zbl 0711.62002
[33] A. Wald: Statistical Decision Functions. J. Wiley, New York 1950. · Zbl 0034.22804