Elements of information theory. 2nd ed. (English) Zbl 1140.94001
Wiley-Interscience. Hoboken, NJ: John Wiley & Sons (ISBN 0-471-24195-4/hbk; 0-471-74881-1/ebook). xxiii, 748 p. (2006).
The second edition of the book contains the following 17 chapters: Introduction and preview; Entropy, relative entropy, and mutual information; Asymptotic equipartition property; Entropy rates of a stochastic process; Data compression; Gambling and data compression; Channel capacity; Differential entropy; Gaussian channel; Rate distortion theory; Information theory and statistics; Maximum entropy; Universal source coding; Kolmogorov complexity; Network information theory; Information theory and portfolio theory; and Inequalities in information theory.
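As a small illustration of the notions treated in the opening chapters (this sketch is the reviewer's, not taken from the book), the Shannon entropy of a discrete distribution can be computed directly from its definition H(X) = -Σ p(x) log₂ p(x):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    `p` is a sequence of probabilities summing to 1; terms with
    zero probability contribute nothing, by the convention 0 log 0 = 0.
    """
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))   # → 1.0
# A biased coin carries less, and a deterministic outcome carries none.
print(entropy([0.9, 0.1]))
print(entropy([1.0]))        # → 0.0
```

The uniform distribution maximizes entropy over a fixed alphabet, one of the basic facts developed in the book's early chapters.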
The book may be used for a two-quarter graduate course. The mathematical level is reasonably high but does not require knowledge of measure theory. At least a good one-semester course in probability theory is needed to follow the book.
The book is nicely written, with many applications, examples, and exercises, and is highly recommended to anyone who wants to study information theory.
For the review of the first edition (1991) see Zbl 0762.94001.

94-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
94A15 Information theory (general)
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
94A34 Rate-distortion theory in information and communication theory
Full Text: DOI