
On asymptotic sufficiency and optimality of quantizations. (English) Zbl 1098.62004

Summary: It is known that quantization of primary sources of information reduces the information available for statistical inference. We are interested in quantizations for which the loss of statistical information can be controlled by the number of cells of the partition of the observation space used to quantize observations. If the losses converge to zero as the number of cells increases, we speak of asymptotically sufficient quantizations.
Optimality is treated on the basis of the rate of this convergence. Attention is restricted to models with continuous real-valued observations and to interval partitions. We give easily verifiable necessary and sufficient conditions for asymptotic sufficiency and, for a most common measure of statistical information, we also study the rate of convergence to the information in the original non-quantized models. Applications of the results to concrete models are illustrated by examples.
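The effect described in the summary can be illustrated numerically. The sketch below is not taken from the paper; the choice of two Gaussian models, the equally spaced interval partitions, and the Kullback-Leibler divergence as the measure of statistical information are all illustrative assumptions. Quantizing both models by the same nested interval partitions, the divergence of the quantized (discrete) models increases toward the divergence of the original continuous models, KL(N(0,1) ‖ N(1,1)) = 1/2, as the number of cells grows:

```python
import math

def norm_cdf(x, mu):
    # CDF of N(mu, 1), via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0)))

def cell_probs(mu, edges):
    # Probabilities of the interval cells (-inf, e1], (e1, e2], ..., (e_{k-1}, +inf)
    cdf = [norm_cdf(e, mu) for e in edges]
    return [cdf[0]] + [b - a for a, b in zip(cdf, cdf[1:])] + [1.0 - cdf[-1]]

def kl(p, q):
    # Kullback-Leibler divergence between two discrete distributions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

# Non-quantized value: KL(N(0,1) || N(1,1)) = (0 - 1)^2 / 2 = 0.5
quantized = {}
for k in (4, 16, 64, 256):
    # k - 1 equally spaced interior edges over [-5, 6] give k interval cells;
    # the partitions are nested, so the quantized divergence is nondecreasing in k
    edges = [-5.0 + 11.0 * i / k for i in range(1, k)]
    quantized[k] = kl(cell_probs(0.0, edges), cell_probs(1.0, edges))
    print(k, quantized[k])
```

By the data-processing inequality for f-divergences, each quantized value stays below 1/2; the gap to 1/2 is the loss of statistical information whose rate of decay the paper studies.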

MSC:

62B10 Statistical aspects of information-theoretic topics
62A01 Foundations and philosophical topics in statistics
94A17 Measures of information, entropy

References:

[1] Bock, H. H., A clustering technique for maximizing \(\phi \)-divergence, noncentrality and discrimination power, (Schader, M., Analyzing and Modelling Data and Knowledge (1992), Springer: Springer Berlin), 19-36
[2] Csiszár, I., 1973. Generalized entropy and quantization problems. Transactions of the 6th Prague Conference Information Theory, Statistics and Decision Functions, Random Processes. Academia, Prague, pp. 159-174. · Zbl 0286.94020
[3] De Groot, M. H., Optimal Statistical Decisions (1970), McGraw Hill: McGraw Hill New York · Zbl 0225.62006
[4] Feller, W., 1966. An Introduction to Probability Theory and its Applications, vol. 2, second ed. Wiley, New York. · Zbl 0138.10207
[5] Ghurye, S. G.; Johnson, B. R., Discrete approximations to the information integral, Canad. J. Statist., 9, 27-37 (1981) · Zbl 0473.62007
[6] Graf, S.; Luschgy, H., Foundations of Quantization for Probability Distributions (2000), Springer: Springer Berlin · Zbl 0951.60003
[7] Kallenberg, W. C.M.; Oosterhoff, J.; Schriever, B. F., The number of classes in chi-squared goodness-of-fit tests, J. Amer. Statist. Assoc., 80, 959-968 (1985) · Zbl 0582.62037
[8] Liese, F.; Vajda, I., Convex Statistical Distances (1987), Teubner: Teubner Leipzig · Zbl 0656.62004
[9] Mayoral, A. M.; Morales, D.; Morales, J.; Vajda, I., On efficiency of estimation and testing with data quantized to fixed number of cells, Metrika, 57, 1-27 (2003) · Zbl 1433.62021
[10] Menéndez, M. L.; Morales, D.; Pardo, L.; Vajda, I., Minimum disparity estimators for discrete and continuous models, Appl. Math., 46, 439-466 (2001) · Zbl 1059.62001
[11] Österreicher, F.; Vajda, I., Statistical information and discrimination, IEEE Trans. Inform. Theory, 39, 1036-1039 (1993) · Zbl 0792.62005
[12] Poetzelberger, K.; Strasser, H., Clustering and quantization by MSP-partitions, Statist. Decisions, 19, 331-371 (2001) · Zbl 1180.62093
[13] Vajda, I., On convergence of information contained in quantized observations, IEEE Trans. Inform. Theory, 48, 2163-2172 (2002) · Zbl 1062.94533
[14] Zografos, K.; Ferentinos, K.; Papaioannou, T., Discrete approximations of Csiszár, Rényi and Fisher measures of information, Canad. J. Statist., 14, 355-366 (1986) · Zbl 0624.62008