Entropy and prefixes (Q1184096)

From MaRDI portal
scientific article
    Statements

    Entropy and prefixes (English)
    28 June 1992
    Let \(A\) be a finite alphabet and let \(X=A^N\) be the usual compact metrizable space of sequences \(x=x_1x_2\dots\) drawn from \(A\). Let \(\sigma\) denote the one-sided shift on \(X\). Any \(\sigma\)-invariant probability measure \(\mu\) on \(X\) defines a so-called symbolic process, also denoted by \(\mu\). For \(x\in X\) and \(1\leq i\leq n\) let \[ L^n_i(x)=\min\{\ell:\ x_i\cdots x_{i+\ell-1}\neq x_j\cdots x_{j+\ell-1}\ \text{for all}\ 1\leq j\leq n,\ j\neq i\}, \] and assume that \(\mu\) is ergodic. \textit{P. Grassberger} [IEEE Trans. Inf. Theory IT-35, 669-675 (1989)] proposed that \[ \lim_{n\to\infty}\frac{n\log n}{\sum^n_{i=1}L^n_i(x)}=H \quad \mu\text{-a.e.}, \tag{C} \] where \(H=H(\mu)\) denotes the entropy of \(\mu\). In this paper the author proves that conjecture (C) is true if \(\mu\) is Bernoulli, mixing Markov, or of entropy zero. He shows that it fails in general by exhibiting counterexamples which are very weak Bernoulli processes. A weaker statement is proved, however: if \(H>0\) and \(\varepsilon>0\) is given, then for \(\mu\)-a.e. \(x\) there exists \(N=N(x,\varepsilon)\) such that for all \(n\geq N\) all but an \(\varepsilon\)-fraction of the numbers \(L^n_i(x)/\log n\) \((i=1,2,\dots,n)\) lie within \(\varepsilon\) of \(1/H\). Using these counterexamples, a related conjecture about Hausdorff dimension is also shown to be false.
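    To make the quantity in (C) concrete, here is a brute-force Python sketch (not taken from the paper) that computes the prefix lengths \(L^n_i(x)\) and the estimate \(n\log n/\sum_{i=1}^n L^n_i(x)\) for a simulated i.i.d. fair-coin process, whose entropy is \(\log 2\); the function names, sample size, and seed are illustrative choices rather than anything prescribed by the article.

```python
import math
import random


def prefix_lengths(x, n):
    """L^n_i(x) for i = 1, ..., n (0-based below): the shortest length ell
    such that the block x_i ... x_{i+ell-1} differs from the block of the
    same length starting at every other position j <= n, j != i.
    Brute force, roughly O(n^2)."""
    lengths = []
    for i in range(n):
        ell = 1
        while True:
            block = x[i:i + ell]
            # stop once no other starting position shares this block
            if all(x[j:j + ell] != block for j in range(n) if j != i):
                break
            ell += 1
        lengths.append(ell)
    return lengths


def grassberger_estimate(x, n):
    """The quantity n*log(n) / sum_i L^n_i(x) appearing in conjecture (C)."""
    return n * math.log(n) / sum(prefix_lengths(x, n))


if __name__ == "__main__":
    random.seed(1)
    n = 500
    # i.i.d. fair-coin process (a Bernoulli process with entropy log 2);
    # the extra tail symbols let blocks starting near position n grow freely
    x = [random.randint(0, 1) for _ in range(4 * n)]
    print("estimate    :", round(grassberger_estimate(x, n), 3))
    print("true entropy:", round(math.log(2), 3))
```

    For a Bernoulli process such as this one the paper confirms that the estimate converges to \(H\); the very weak Bernoulli counterexamples mentioned above show that such convergence cannot be expected for arbitrary ergodic processes.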
    symbolic dynamical systems
    Markov process
    entropy-zero process
    entropy
    Bernoulli processes
    Hausdorff dimension