# zbMATH — the first resource for mathematics

Coding and information theory. (English) Zbl 0752.94001
Graduate Texts in Mathematics. 134. New York: Springer-Verlag. xvii, 486 p. (1992).
We live in an age in which coding and information theory is flourishing as never before. This branch of mathematics is distinguished by its wealth of beautiful theorems and theories and is, at the same time, forging tools for practical problems in the reliable communication of digitally encoded information. The theory was born in 1948 [see C. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27, 379-423, 623-656 (1948)]. To quote the author’s introduction: “The main problem of information and coding theory can be described in a simple way as follows. Imagine that a stream of source data, say in the form of bits (0’s and 1’s), is being transmitted over a communications channel, such as a telephone line. From time to time, disruptions take place along the channel, causing some of the 0’s to be turned into 1’s, and vice versa. The question is: «How can we tell when the original source data has been changed, and when it has, how can we recover the original data?»”

This book is an introduction to the theories of information and codes, not only for mathematics students but also for statisticians and engineers. The two theories are usually treated separately but, as both address the problem of communication through noisy channels, the author has been able to exploit the connection to give a reasonably self-contained treatment relating the probabilistic and algebraic viewpoints. The prerequisites are nothing more than a basic knowledge of elementary probability, together with a foundation in modern algebra and linear algebra. The first quarter of the book is devoted to information theory; the remainder is devoted to coding theory and has a decidedly algebraic flavour. The individual chapters, with very brief sketches of their contents, are as follows:
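The problem quoted above can be made concrete with a minimal sketch (not taken from the book): a 3-fold repetition code, the simplest error-correcting code, which recovers from a single flipped bit per block by majority vote.

```python
def encode(bits):
    # repeat each source bit three times
    return [b for b in bits for _ in range(3)]

def decode(received):
    # majority vote over each block of three corrects up to one flip per block
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

source = [1, 0, 1, 1]
sent = encode(source)
corrupted = sent[:]
corrupted[4] ^= 1          # the channel flips one bit
assert decode(corrupted) == source  # the original data is recovered
```

The price of this reliability is rate: three channel bits are spent per source bit, and much of the book is about codes that correct errors far more efficiently.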
Chapter 1. Entropy of a source, properties of entropy.
Chapter 2. Variable length encoding, Huffman encoding, the noiseless coding theorem.
Chapter 3. The discrete memoryless channel and conditional entropy, mutual information and channel capacity, the noisy coding theorem.
Chapter 4. General remarks on codes, minimum distance decoding, families of codes, codes and designs.
Chapter 5. Linear codes, weight distributions, maximum distance separable codes, invariant theory and self-dual codes.
Chapter 6. Hamming and Golay codes, Reed-Muller codes.
Chapter 7. Finite fields and cyclic codes.
Chapter 8. Some cyclic codes: BCH codes, Reed-Solomon and Justesen codes, alternant codes and Goppa codes, quadratic residue codes.
Appendix. Algebraic preliminaries, Möbius inversion, binomial inequalities.
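As a small illustration of the starting point of the information-theoretic part (Chapter 1), and not an excerpt from the book, the Shannon entropy of a memoryless source measures its average information content in bits per symbol:

```python
from math import log2

def entropy(probs):
    # Shannon entropy H(X) = -sum p * log2(p), in bits per symbol;
    # terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin carries exactly 1.0 bit per toss
print(entropy([0.9, 0.1]))  # a biased coin carries less, about 0.469 bits
```

The noiseless coding theorem of Chapter 2 then says that this quantity is precisely the best achievable average code length, which schemes such as Huffman encoding approach.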
The volume concludes with a list of books on information and coding theory and a few more advanced references for further reading. The general presentation of the material is attractive, the proofs and explanations are clear, and there are plenty of examples providing concrete illustrations of abstract ideas. Each chapter concludes with a most helpful set of exercises. The book constitutes a valuable addition to the literature in this field and can be most warmly recommended.
Reviewer: G.Faina (Perugia)

##### MSC:
94-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
94Bxx Theory of error-correcting codes and error-detecting codes