Cover, Thomas M.; Thomas, Joy A. Elements of information theory. 2nd ed. (English) Zbl 1140.94001
Wiley-Interscience. Hoboken, NJ: John Wiley & Sons (ISBN 0-471-24195-4/hbk; 0-471-74881-1/ebook). xxiii, 748 p. (2006).

The second edition of the book contains the following 17 chapters: Introduction and preview; Entropy, relative entropy, and mutual information; Asymptotic equipartition property; Entropy rates of a stochastic process; Data compression; Gambling and data compression; Channel capacity; Differential entropy; Gaussian channel; Rate distortion theory; Information theory and statistics; Maximum entropy; Universal source coding; Kolmogorov complexity; Network information theory; Information theory and portfolio theory; and Inequalities in information theory.

The book may be used for a two-quarter graduate course. The mathematical level is reasonably high but does not require knowledge of measure theory. At least a good one-semester course in probability theory is needed to follow the book. The book is nicely written, with many applications, examples, and unsolved problems, and is highly recommended to those who want to study information theory. For the review of the first edition (1991) see Zbl 0762.94001.

Reviewer: Pushpa N. Rathie (Brasília)

Cited in 1 Review; cited in 1349 Documents.

MSC:
94-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
94A15 Information theory (general)
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
94A34 Rate-distortion theory in information and communication theory

Keywords: probability theory; asymptotic equipartition property; Kolmogorov complexity; channel capacity; error probability; channel coding theorem; coding theorem; Gaussian channel; maximum entropy; spectral estimation; method of types; law of large numbers; large deviation theory; hypothesis testing; Neyman-Pearson lemma; Chernoff bound; universal coding; Fisher information; Cramér-Rao inequality; rate distortion theory; network information theory; multiple access channel; broadcast channel; relay channel; Slepian-Wolf theorem; information theory; Fano's inequality; data compression; source codes; Kraft inequality; Shannon-Fano-Elias coding; Turing machine; halting problem; Hamming codes; multi-user information theory; interference channel; two-way channel; correlated sources; side information; thermodynamics

Citations: Zbl 0762.94001

Cite: \textit{T. M. Cover} and \textit{J. A. Thomas}, Elements of information theory. 2nd ed. Hoboken, NJ: John Wiley \& Sons (2006; Zbl 1140.94001)
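The book's central quantities, entropy and mutual information (Chapter 2 of the text), can be sketched in a few lines of code. A minimal illustration, assuming discrete distributions given as probability lists (function names are this sketch's own, not from the book):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p log2 p, in bits (with 0 log 0 := 0)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability table."""
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    flat = [p for row in joint for p in row]  # joint distribution, flattened
    return entropy(px) + entropy(py) - entropy(flat)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))  # → 1.0

# Independent X and Y share zero information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```

The identity I(X;Y) = H(X) + H(Y) - H(X,Y) used here is one of the basic relations developed in the book's Chapter 2.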