
Information theory. Part I: An introduction to the fundamental concepts. (English) Zbl 1392.94001

Hackensack, NJ: World Scientific (ISBN 978-981-3208-82-7/hbk; 978-981-3208-83-4/pbk). xix, 342 p. (2017).
The author states in the preface: “the aim of this book is to provide the reader with an introduction to the fundamental concepts of Information Theory (IT), using simple, clear, and reader-friendly language”. The structure of the book is garbled: after five chapters come seven appendices and then nearly twenty pages of notes. There are no new results, and the presentation is often based on the author’s earlier publications (19 references).
The 0-th chapter covers elements of probability theory – frequencies, Bayes’ theorem, basic probability distributions, Markov chains. IT is considered in the following five chapters: “Introduction, Definition, and Interpretations of Shannon’s Measure of Information”, “Properties of Shannon’s Measure of Information”, “Conditional and Mutual Information”, “Multivariate Mutual Information”, “Entropy and the Second Law of Thermodynamics”. The presentation is reader-friendly – many examples, some exercises – but it often relies on subjective phrases (“Clearly, …”, “It is plausible …”, etc.), which does not allow one to consider this book a textbook.
However, the main theme of the book is not IT; besides, “IT does not solve any problem” (page 68). The main theme of the book is expressed in the dedication: “This book is dedicated to all those who confuse the concepts of Information with Shannon’s Measure of Information and Shannon’s Measure of Information with Entropy”. The first distinction – between something (e.g. information) and a measure of it – is obvious to any reader with at least an elementary education, so the main point of the whole book lies in considering the state of sin of those who do not understand the second one. Shannon’s Measure of Information (the author makes heavy use of the overloaded abbreviation SMI) is defined on page 60: “we will refer to the quantity \(H\) defined above as the Shannon’s Measure of Information (SMI)”. The quantity \(H\) is explained earlier – it is the entropy introduced by Shannon, so SMI = \(H\) = (Shannon’s) entropy. But disconcerting statements soon appear; e.g., on page 63 it is first stated that “SMI is not information”, and in the next sentence that “it is not a measure of any piece of information, but a very particular kind of information” (it is not explained what “a piece of information” is).
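For reference (the review does not reproduce the formula), the quantity \(H\) in question is Shannon’s familiar expression for a discrete probability distribution \(p_1,\ldots,p_n\),
\[
H(p_1,\ldots,p_n) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i ,
\]
with the usual convention \(0\log 0 = 0\).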
Only in the last chapter does the main message of the book appear: the quantity \(H\) should be called ‘entropy’ only when considering isolated systems at equilibrium (this concept is not defined); in all other cases (for systems not at equilibrium) \(H\) should be called ‘SMI’: “we define the concept of entropy as a special case of SMI”, i.e. entropy \( \subset \) SMI, “entropy is fixed”, entropy “…does not change with time”, “…does not tend to maximum”. The author states several times that “Shannon erred” (e.g. page 55) and that the assumptions used by Shannon to prove the formula for \(H\) “do not apply to the general concept of information” (page 58; it is not explained what “the general concept of information” is). But Shannon knew what he was talking about and proved theorems of great practical value; this book presents only the author’s vision of the need to revise the established terminology, and whether the scientific community accepts this vision remains to be seen.
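To indicate the kind of identification presumably intended (the review does not spell it out): for an isolated system at equilibrium with \(W\) equally probable microstates, the SMI reduces to
\[
H \;=\; -\sum_{i=1}^{W} \frac{1}{W}\log_2\frac{1}{W} \;=\; \log_2 W ,
\]
which, up to the constant factor \(k_B \ln 2\), is the Boltzmann entropy \(S = k_B \ln W\); it is in this special case, and only in it, that the author proposes to use the word ‘entropy’.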

MSC:

94-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
94A15 Information theory (general)
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics