## Statistical tests of optimality of source codes. (English) Zbl 0857.62004

Summary: For newly defined data compaction codes, as well as for traditional data compression codes, we prove an asymptotic uniformity of the probabilities of codewords, a kind of asymptotic equipartition property. On this basis we propose an easily applicable Neyman-Pearson test of optimality of a code with a given asymptotic significance level $$0 < \alpha < 1$$. The test is based on the sample entropy of the code.
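The summary does not spell out the test statistic, but a test of codeword uniformity based on sample entropy can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure: it uses the plug-in entropy estimator and its standard asymptotic normality (a Basharin-type variance approximation); the function name and return values are choices made here.

```python
import math
from collections import Counter
from statistics import NormalDist


def sample_entropy_test(codewords, alpha=0.05):
    """One-sided asymptotic test of codeword uniformity.

    H0: the observed codewords are uniformly distributed over the
    M distinct values seen (sample entropy ~ log2 M, i.e. the code
    is optimal in the equipartition sense); H1: entropy is lower.

    Illustrative sketch only -- the paper's actual Neyman-Pearson
    test may differ in statistic and normalization.
    """
    n = len(codewords)
    counts = Counter(codewords)
    M = len(counts)                       # distinct codewords observed
    probs = [c / n for c in counts.values()]

    # Plug-in (empirical) entropy in bits.
    h_hat = -sum(p * math.log2(p) for p in probs)

    # Estimated variance of -log2 p(X): Var = sum p (log2 p + H)^2.
    var = sum(p * (math.log2(p) + h_hat) ** 2 for p in probs)
    se = math.sqrt(var / n) if var > 0 else 0.0

    h0 = math.log2(M)                     # entropy under exact uniformity
    z = (h_hat - h0) / se if se > 0 else 0.0
    z_crit = NormalDist().inv_cdf(alpha)  # lower-tail critical value

    return h_hat, z, z < z_crit           # (entropy, statistic, reject H0?)
```

For perfectly uniform codewords the sample entropy equals `log2 M` and the test does not reject; for a strongly skewed codeword distribution the entropy deficit drives the statistic far into the rejection region.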

### MSC:

- 62B10 Statistical aspects of information-theoretic topics
- 94A24 Coding theorems (Shannon theory)
- 94A99 Communication, information
- 94B99 Theory of error-correcting codes and error-detecting codes

### References:

- [1] T. Berger: Rate Distortion Theory. Prentice Hall, Englewood Cliffs 1971.
- [2] R. E. Blahut: Principles and Practice of Information Theory. Addison-Wesley, Reading 1987. · Zbl 0681.94001
- [3] T. M. Cover, J. A. Thomas: Elements of Information Theory. Wiley, New York 1991. · Zbl 0762.94001
- [4] J. Feistauerova: An entropic test of uniformity and optimality of codebooks. Problems Control Inform. Theory 17 (1988), 319-325. · Zbl 0666.62006
- [5] J. Feistauerova, I. Vajda: Testing system entropy and prediction error probability. IEEE Trans. Systems Man Cybernet. 23 (1993), 5, 1352-1358.
- [6] R. M. Gray, L. D. Davisson: Source coding theorems without the ergodic assumption. IEEE Trans. Inform. Theory 20 (1974), 502-516. · Zbl 0301.94026
- [7] E. L. Lehmann: Testing Statistical Hypotheses. Wiley, New York 1962.