On the increase of conditional entropy in Markov chains. (English) Zbl 0707.60059

Information theory, statistical decision functions, random processes, Trans. 10th Prague Conf., Prague/Czech. 1986, Vol. A, 391-396 (1988).
Summary: [For the entire collection see Zbl 0701.00027.]
Convex increase in \(n\) of the \(n\)-step conditional entropy \(H(X_n \mid X_0)\) is shown for a stationary Markov chain \(X_0, X_1, \dots\). The convergence of the conditional entropy directly yields a simple variation of Rényi's information-theoretic proof of Markov's classical limit theorem for ergodic chains. The convex increase of the conditional entropy is then considered in the case of doubly stochastic transition matrices.
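A minimal numerical sketch (not from the paper, using an arbitrary two-state transition matrix chosen here for illustration): for a stationary chain with transition matrix \(P\) and stationary distribution \(\pi\), the \(n\)-step conditional entropy is \(H(X_n \mid X_0) = \sum_i \pi_i H(P^n_{i\cdot})\), and printing it for successive \(n\) shows the monotone increase towards \(H(\pi)\) described above.

```python
# Sketch: H(X_n | X_0) for a stationary two-state Markov chain (example data).
import numpy as np

def row_entropy(row):
    """Shannon entropy (in nats) of a probability vector, ignoring zeros."""
    p = row[row > 0]
    return float(-(p * np.log(p)).sum())

def conditional_entropy(P, pi, n):
    """H(X_n | X_0) for a chain started in its stationary distribution pi."""
    Pn = np.linalg.matrix_power(P, n)
    return sum(pi[i] * row_entropy(Pn[i]) for i in range(len(pi)))

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # example transition matrix (assumption)
pi = np.array([2/3, 1/3])         # its stationary distribution: pi @ P == pi

for n in range(1, 8):
    print(n, conditional_entropy(P, pi, n))
# The values increase with n and approach H(pi), the entropy of the
# stationary distribution, in line with the ergodic limit theorem.
```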
Sufficient conditions, formulated in terms of Latin squares, permutation matrices, and groups, are discussed for the entropy of the state distribution to exhibit the same convex-increasing behaviour regardless of the initial state.
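A minimal sketch of one such structured case (an illustration chosen here, not the paper's construction): when the doubly stochastic matrix \(P\) is a circulant, i.e. a convex combination of powers of a cyclic permutation matrix, \(P^n\) is again a circulant, so every row of \(P^n\) is a cyclic shift of the first row and the entropy of the state distribution after \(n\) steps is identical for every initial state.

```python
# Sketch: a group-structured (circulant) doubly stochastic matrix gives
# initial-state-independent entropy of the n-step state distribution.
import numpy as np

def row_entropy(row):
    p = row[row > 0]
    return float(-(p * np.log(p)).sum())

k = 4
shift = np.roll(np.eye(k), 1, axis=1)      # cyclic permutation matrix
weights = [0.5, 0.3, 0.2]                  # arbitrary convex weights (assumption)
P = sum(w * np.linalg.matrix_power(shift, j) for j, w in enumerate(weights))

for n in (1, 2, 5):
    Pn = np.linalg.matrix_power(P, n)
    print(n, [row_entropy(Pn[i]) for i in range(k)])  # same value for every row
```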

MSC:

60J10 Markov chains (discrete-time Markov processes on discrete state spaces)

Citations:

Zbl 0701.00027