Probability and information. A transl. of the 3rd rev. and enl. Russian ed. by V.K. Jain. (English) Zbl 0544.94001

Theory and Decision Library, Vol. 35. Dordrecht-Boston-Lancaster: D. Reidel Publishing Company, a member of the Kluwer Academic Publishers Group. XX, 421 p. Dfl. 190.00; $69.00 (1983).
The aim of this book is to introduce information theory gradually, so that it can be followed by high-school and undergraduate students and teachers, and by readers interested in the subject who have little mathematical background. This purpose has been achieved by avoiding excessive mathematical rigour and by paying great attention to the intuitive interpretation of concepts and fundamental results. Nevertheless, through the successive editions the book has been enlarged (by introducing some new methods and problems and by slightly increasing its scientific rigour) while maintaining its original purpose.
The contents of the chapters are as follows:
Chapter 1 starts with the statistical definition of probability (a theoretical approach to the concept of probability) and goes on to adopt the classical definition (a practical approach well suited to most problems in this book). Relevant properties of probability and related concepts are then studied. In order to bring some rigour to the coding problem of Chapter 4, some weak laws of large numbers are also recalled. Chapter 1 finishes with the structure of Boolean algebra and the axiomatic definition of probability (a theoretical approach that applies more generally than the previous definitions).
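The relation between the two approaches can be illustrated with a minimal sketch (not from the book; the fair die and the sample size are assumed for the example): the statistical definition identifies probability with the long-run relative frequency of an event, which the weak law of large numbers predicts will stabilize near the classical value.

```python
import random

random.seed(0)  # fixed seed so the simulated run is reproducible

# Classical definition: equally likely cases, so P(six) = 1/6.
p_classical = 1 / 6

# Statistical definition: relative frequency of the event in a long
# run of trials; by the weak law of large numbers it stabilizes
# near the classical value as the number of trials grows.
n = 100_000
hits = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
p_statistical = hits / n

print(f"classical: {p_classical:.4f}, observed frequency: {p_statistical:.4f}")
```

With 100,000 trials the observed frequency typically agrees with 1/6 to about two decimal places, which is the intuitive content of the weak laws recalled in the chapter.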
In Chapter 2 entropy is introduced as a measure of the uncertainty of an experiment with a finite number of outcomes whose probabilities are known. The authors first point out the advantages of Shannon's entropy over the earlier Hartley entropy, and then emphasize that Shannon's entropy depends only on the outcome probabilities, a plausible condition for transmission problems. The entropy of a compound experiment, and the entropy of an experiment given the outcome of another, are established immediately. From the latter concept the conditional entropy is defined, and exact and boundary relations among "joint", "marginal" and conditional entropies are analyzed. A measure of the amount of information about one experiment contained in another is then introduced as a reduction in entropy, and entropy itself is interpreted in terms of the amount of information. Finally, the authors develop an axiomatic characterization of Shannon's entropy with simpler proofs (that is, proofs requiring less mathematical background) than other axiomatic treatments (Faddeev, Aczél, Daróczy, Forte and Ng).
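These relations are easy to check numerically. The following sketch uses a small made-up joint distribution (the numbers are illustrative, not taken from the book) to verify the chain rule H(AB) = H(A) + H(B|A) and to compute the information about B contained in A as the entropy reduction H(B) - H(B|A):

```python
import math

def entropy(probs):
    """Shannon entropy, -sum p*log2(p), in bits; zero terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(a, b) of two experiments A and B.
joint = {("a1", "b1"): 0.4, ("a1", "b2"): 0.1,
         ("a2", "b1"): 0.2, ("a2", "b2"): 0.3}

# Marginal distributions of A and B.
pa, pb = {}, {}
for (a, b), p in joint.items():
    pa[a] = pa.get(a, 0.0) + p
    pb[b] = pb.get(b, 0.0) + p

H_AB = entropy(joint.values())
H_A = entropy(pa.values())
H_B = entropy(pb.values())
H_B_given_A = H_AB - H_A      # chain rule: H(AB) = H(A) + H(B|A)
info = H_B - H_B_given_A      # information about B contained in A

print(f"H(A)={H_A:.3f}  H(B)={H_B:.3f}  H(AB)={H_AB:.3f}")
print(f"H(B|A)={H_B_given_A:.3f}  I={info:.3f}")
```

The computed information is nonnegative and the conditional entropy never exceeds the marginal one, matching the boundary relations discussed in the chapter.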
Chapter 3 makes use of concepts and results from Chapter 2 to solve certain logical problems, such as determining the least number of questions needed to achieve a specified purpose, determining the least number of weighings needed to find a counterfeit coin, and other similar problems under various conditions. Since the arguments used in solving these problems are also useful for certain engineering problems, a general formulation of all of them is given in terms of the underlying idea of the logical ones.
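The entropy argument behind these problems yields an information-theoretic lower bound: a weighing on a balance has three possible outcomes, so k weighings can distinguish at most 3^k cases, and hence at least log base 3 of the number of cases are needed. A minimal sketch of this bound (the specific instances are illustrative, not taken from the book):

```python
def min_weighings(n_cases, outcomes_per_weighing=3):
    """Smallest k with outcomes_per_weighing**k >= n_cases:
    the information-theoretic lower bound on the number of weighings."""
    k = 0
    while outcomes_per_weighing ** k < n_cases:
        k += 1
    return k

# One counterfeit coin, known to be heavier, among 27: 27 possible cases.
print(min_weighings(27))
# Classic puzzle: 12 coins, counterfeit heavier or lighter: 24 cases.
print(min_weighings(24))
```

For both instances the bound is three weighings, and for these classical puzzles strategies achieving the bound are known, which is why the same counting argument carries over to the coding problems of Chapter 4.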
Chapter 4 deals with the application of the concepts of entropy and information to the transmission of information through communication channels (an engineering problem). First, basic concepts such as message, code and efficiency of a code are introduced. Second, the Huffman code (the most efficient code in a certain sense) is described, and the fundamental noiseless coding theorem, connecting the noiseless coding of a message with Shannon's entropy, is established. The authors then consider the estimation of the information and entropy contained in messages of human communication, such as written language (chiefly written English), spoken language (chiefly spoken English), music, continuously varying messages (e.g., television images), phototelegrams, engineering communication and genetic information transmission. Later, the fundamental noisy coding theorem, connecting noisy coding with Shannon's entropy, is established. Finally, error-detecting and error-correcting codes, as well as methods for constructing them (methods based on algebraic coding theory), are analyzed.
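A minimal sketch of the Huffman construction mentioned here, with frequencies made up for the example: the two least probable symbols are repeatedly merged, and the noiseless coding theorem guarantees that the resulting average codeword length lies between H and H + 1.

```python
import heapq
import math

def huffman_lengths(freqs):
    """Codeword lengths of a binary Huffman code for the given symbol
    frequencies (a sketch; ties are broken by insertion order)."""
    # Each heap entry: (weight, tiebreak id, {symbol: depth so far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)   # merge the two least
        w2, _, d2 = heapq.heappop(heap)   # probable subtrees
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

freqs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
lengths = huffman_lengths(freqs)
avg_len = sum(freqs[s] * lengths[s] for s in freqs)
H = -sum(p * math.log2(p) for p in freqs.values())
print(lengths, f"average length={avg_len:.2f}", f"entropy={H:.2f}")
```

For this distribution the average length (1.9 bits) falls between the entropy (about 1.85 bits) and entropy plus one, and the lengths satisfy the Kraft inequality, so a prefix-free code with these lengths exists.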
In addition, basic properties of convex functions, algebraic concepts and useful tables are collected in several appendices.
Reviewer: M.A.Gil Alvarez


94-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
94A15 Information theory (general)
94A17 Measures of information, entropy
60-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to probability theory
94A24 Coding theorems (Shannon theory)
94B05 Linear codes (general theory)