
Entropy and prefixes. (English) Zbl 0765.28016

Let \(A\) be a finite alphabet and let \(X=A^N\) be the usual compact metrizable space of sequences \(x=x_1x_2\dots\) drawn from \(A\). Let \(\sigma\) denote the one-sided shift on \(X\). Any \(\sigma\)-invariant probability measure \(\mu\) on \(X\) defines a so-called symbolic process, also denoted by \(\mu\). For any \(x\in X\) and \(1\le i\le n\) let \[ L^n_i(x)=\min\{\ell:\ x_i\cdots x_{i+\ell-1}\neq x_j\cdots x_{j+\ell-1}\ \text{for all}\ 1\leq j\leq n,\ j\neq i\}, \] and assume that \(\mu\) is ergodic. P. Grassberger [IEEE Trans. Inf. Theory IT-35, 669-675 (1989)] proposed that \[ \lim_{n\to\infty}\frac{n\log n}{\sum^n_{i=1}L^n_i(x)}=H \quad \mu\text{-a.e.}, \tag{C} \] where \(H=H(\mu)\) denotes the entropy of \(\mu\). In this paper the author proves that the conjecture (C) is true if \(\mu\) is Bernoulli, mixing Markov, or of zero entropy. He shows that it fails in general by exhibiting counterexamples which are very weak Bernoulli processes. A weaker conjecture is proved, however: if \(H>0\) and \(\varepsilon>0\) is given, then for \(\mu\)-a.e. \(x\) there exists \(N=N(x,\varepsilon)\) such that for all \(n\geq N\) all but an \(\varepsilon\) fraction of the numbers \(L^n_i(x)/\log n\) \((i=1,2,\dots,n)\) lie within \(\varepsilon\) of \(1/H\). Using these counterexamples, a related conjecture about Hausdorff dimension is also shown to be false.
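
To make the quantities concrete, here is a minimal Python sketch (not part of the paper under review) that computes the prefix lengths \(L^n_i(x)\) by brute force and forms Grassberger's estimator \(n\log n/\sum_i L^n_i(x)\) for a fair-coin Bernoulli process, a case in which conjecture (C) holds; the function names `prefix_lengths` and `grassberger_estimate` are illustrative only, and the estimate should come out near \(\log 2\) nats.

```python
import math
import random

def prefix_lengths(x, n):
    """L^n_i(x) for the first n positions (0-based here, 1-based in the text):
    the shortest length l such that the block x[i:i+l] differs from x[j:j+l]
    for every j < n with j != i."""
    lengths = []
    for i in range(n):
        needed = 1
        for j in range(n):
            if j == i:
                continue
            # longest common prefix of the suffixes starting at i and j
            l = 0
            while i + l < len(x) and j + l < len(x) and x[i + l] == x[j + l]:
                l += 1
            # one more symbol is needed to distinguish position i from j
            needed = max(needed, l + 1)
        lengths.append(needed)
    return lengths

def grassberger_estimate(x, n):
    """Grassberger's entropy estimator n*log(n) / sum_i L^n_i(x), in nats."""
    return n * math.log(n) / sum(prefix_lengths(x, n))

# Fair coin tosses: an i.i.d. Bernoulli process with entropy log 2 ~ 0.693 nats.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
print(grassberger_estimate(x, 500))
```

The quadratic scan is only meant to mirror the definition of \(L^n_i\); for large \(n\) a suffix-tree or suffix-array implementation would be the practical choice.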

MSC:

28D20 Entropy and other invariants
28D10 One-parameter continuous families of measure-preserving transformations
94A17 Measures of information, entropy
60F15 Strong limit theorems