Entropy and the consistent estimation of joint distributions (Q1336572)
From MaRDI portal
Reviewed by: Min-ping Qian
Revision as of 18:03, 10 February 2024
Language | Label | Description | Also known as
---|---|---|---
English | Entropy and the consistent estimation of joint distributions | scientific article |
Statements
Entropy and the consistent estimation of joint distributions (English)
13 February 1995
Let \((x_1,\dots,x_n)\) be a sample from an ergodic finite alphabet process with ergodic measure \(\mu\). The empirical \(k\)-block distribution \(\widehat\mu_k(a^k_1)\) of a block \(a^k_1=(a_1,\dots,a_k)\) is the relative frequency with which that block appears consecutively in the sample. A sequence \(k(n)\) \((\leq n)\) is said to be admissible for \(\mu\) if \(\sum_{a^{k(n)}_1} |\widehat\mu_{k(n)}(a^{k(n)}_1)- \mu_{k(n)}(a^{k(n)}_1)|\to 0\) almost surely as \(n\to\infty\). It is proven that for ergodic \(\mu\) with positive entropy \(H\), \(k(n)\) is not admissible if \(k(n)\geq \log n/(H-\varepsilon)\), whereas if the process is weak Bernoulli (a class that includes i.i.d. processes, \(\varphi\)-mixing processes, aperiodic Markov chains and functions thereof, and aperiodic renewal processes), then \(k(n)\) is admissible if \(k(n)\leq \log n/(H+\varepsilon)\).
Keywords: \(k\)-block distribution; ergodic finite alphabet process; admissible; ergodic measure; entropy; weak Bernoulli; \(\varphi\)-mixing processes