Entropy and the consistent estimation of joint distributions (Q1336572)

Language: English
Label: Entropy and the consistent estimation of joint distributions
Description: scientific article

    Statements

    Entropy and the consistent estimation of joint distributions (English)
    13 February 1995
    The empirical \(k\)-block distribution \(\widehat\mu_k(a^k_1)\) of a block \(a^k_1=(a_1,\dots,a_k)\) is defined as the relative frequency with which \(a^k_1\) appears consecutively in the sample \(x^n_1=(x_1,\dots,x_n)\) of an ergodic finite alphabet process. A sequence \(k(n)\) \((\leq n)\) is said to be admissible for the corresponding ergodic measure \(\mu\) if \(\sum_{a^{k(n)}_1} |\widehat\mu_{k(n)}(a^{k(n)}_1)- \mu_{k(n)}(a^{k(n)}_1)|\to 0\) almost surely as \(n\to \infty\). It is proved that for an ergodic \(\mu\) with positive entropy \(H\), \(k(n)\) is not admissible if \(k(n)\geq \log n/(H- \varepsilon)\); if, on the other hand, the process is weak Bernoulli (a class that includes i.i.d. processes, \(\varphi\)-mixing processes, aperiodic Markov chains and functions thereof, and aperiodic renewal processes), then \(k(n)\) is admissible if \(k(n)\leq \log n/(H+\varepsilon)\).
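    As an illustration of the quantities discussed in the review, the following Python sketch (not taken from the paper; the function names, the fair-coin example and the choice \(\varepsilon=0.1\) are assumptions) computes the empirical \(k\)-block distribution of a sample and the total-variation sum appearing in the admissibility criterion, with \(k(n)\) chosen just below \(\log n/(H+\varepsilon)\).

import numpy as np
from collections import Counter
from itertools import product

def empirical_k_block_distribution(sample, k):
    # Relative frequency of each k-block occurring consecutively in the sample.
    windows = len(sample) - k + 1
    counts = Counter(tuple(sample[i:i + k]) for i in range(windows))
    return {block: c / windows for block, c in counts.items()}

def total_variation_sum(emp, true):
    # The admissibility criterion: sum over all k-blocks of |empirical - true|.
    blocks = set(emp) | set(true)
    return sum(abs(emp.get(b, 0.0) - true.get(b, 0.0)) for b in blocks)

# Hypothetical example: an i.i.d. fair-coin process, entropy H = log 2 (nats).
rng = np.random.default_rng(0)
n = 100_000
x = rng.integers(0, 2, size=n).tolist()
H, eps = np.log(2), 0.1
k = int(np.log(n) / (H + eps))          # k(n) <= log n / (H + eps)
emp = empirical_k_block_distribution(x, k)
true = {b: 2.0 ** (-k) for b in product((0, 1), repeat=k)}  # exact k-block law
print(k, total_variation_sum(emp, true))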
    \(k\)-block distribution
    ergodic finite alphabet process
    admissible
    ergodic measure
    entropy
    weak Bernoulli
    \(\varphi\)-mixing processes