McMillan, Brockway. The basic theorems of information theory. (English) Zbl 0050.35501
Ann. Math. Stat. 24, 196-219 (1953).

A neat presentation, in terms of stationary time series, of the main concepts and results of Shannon's communication theory in the time-discrete case. As examples we give the definition of the entropy rate, and the formulation, in a somewhat stronger version due to the present author, of the equipartition property for long sequences of letters. Let \(x=\{x_i\}\), where the \(x_i\) take on a finite number of values, be a stationary time series with probability measure \(\mu\), and consider the stochastic variables \[ f_n(x)=-n^{-1}\log \mu([x_0,x_1,\ldots,x_{n-1}]), \] where \([x_0,x_1,\ldots,x_{n-1}]\) is the cylinder set of all realizations taking for \(t=0,\ldots,n-1\) the values indicated, these being the corresponding components of the realization \(x\) on the left-hand side. Then the entropy rate is \(H=\lim_{n\to\infty} E\{f_n(x)\}\), and, provided that the time series is ergodic, \[ E\{| f_n(x)-H|\} \to 0\quad\text{as}\; n\to\infty. \]

Reviewer: G. Elfving

MSC:
94A15 Information theory (general)
94A17 Measures of information, entropy
94A05 Communication theory

Keywords: information theory; Shannon's communication theory; discrete case; stationary time series
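To make the equipartition statement concrete, here is a minimal Python sketch, not taken from the paper, assuming the simplest ergodic stationary case: an i.i.d. Bernoulli(\(p\)) source, for which \(\mu([x_0,\ldots,x_{n-1}])\) factors into a product of letter probabilities and \(f_n(x)\) is directly computable. It estimates \(E\{|f_n(x)-H|\}\) by simulation; the printed deviations should shrink toward 0 as \(n\) grows.

```python
import math
import random

# Illustrative i.i.d. Bernoulli(p) source (an assumption for this sketch;
# the theorem itself covers any ergodic stationary finite-alphabet series).
p = 0.3
# Entropy rate in bits; for an i.i.d. source this is the letter entropy.
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def f_n(n, rng=random):
    """Sample a length-n realization and return -n^{-1} log2 mu([x_0,...,x_{n-1}])."""
    log_mu = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        log_mu += math.log2(p if x == 1 else 1 - p)  # mu factors for i.i.d. letters
    return -log_mu / n

# Monte Carlo estimate of E{|f_n - H|} for growing n.
for n in (10, 100, 1000, 10000):
    samples = [f_n(n) for _ in range(200)]
    mean_dev = sum(abs(s - H) for s in samples) / len(samples)
    print(f"n={n:6d}  E|f_n - H| ~ {mean_dev:.4f}  (H = {H:.4f} bits)")
```

The base-2 logarithm is a convention choice here; the review's limit holds for any fixed logarithm base, which only rescales both \(f_n\) and \(H\).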