A divergence formula for randomness and dimension (Q616503)
From MaRDI portal
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | A divergence formula for randomness and dimension | scientific article | |
Statements
A divergence formula for randomness and dimension (English)
10 January 2011
The reviewer considers the first part of the author's abstract to be the most illustrative summary of the paper's content. ``If \(S\) is an infinite sequence over a finite alphabet \(\Sigma\) and \(\beta\) is a probability measure on \(\Sigma\), then the \textit{dimension} of \(S\) with respect to \(\beta\), written \(\mathrm{dim}^\beta(S)\), is a constructive version of the Billingsley dimension that coincides with the (constructive Hausdorff) dimension \(\mathrm{dim}(S)\) when \(\beta\) is the uniform probability measure. This paper shows that \(\mathrm{dim}^\beta(S)\) and its dual \(\mathrm{Dim}^\beta(S)\), the \textit{strong dimension} of \(S\) with respect to \(\beta\), can be used in conjunction with randomness to measure the similarity of two probability measures \(\alpha\) and \(\beta\) on \(\Sigma\). Specifically, we prove that the divergence formula \[ \mathrm{dim}^\beta(R)=\mathrm{Dim}^\beta(R)= \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha)+ \mathcal{D}(\alpha\,\|\,\beta)} \] holds whenever \(\alpha\) and \(\beta\) are computable, positive probability measures on \(\Sigma\) and \(R\in \Sigma^\infty\) is random with respect to \(\alpha\). In this formula, \(\mathcal{H}(\alpha)\) is the Shannon entropy of \(\alpha\), and \(\mathcal{D}(\alpha\,\|\,\beta)\) is the Kullback-Leibler divergence between \(\alpha\) and \(\beta\).''
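As a numerical illustration of the right-hand side of the divergence formula, the following Python sketch evaluates \(\mathcal{H}(\alpha)/(\mathcal{H}(\alpha)+\mathcal{D}(\alpha\,\|\,\beta))\) for two concrete measures on a binary alphabet. The measures \(\alpha=(1/2,1/2)\) and \(\beta=(9/10,1/10)\), as well as the function names, are chosen here purely for illustration and are not taken from the paper; entropy and divergence are computed in bits, and the ratio is independent of the choice of logarithm base.

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy H(p) in bits of a probability vector p."""
    return -sum(x * log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; assumes q is positive."""
    return sum(x * log2(x / y) for x, y in zip(p, q) if x > 0)

def divergence_formula(alpha, beta):
    """Right-hand side of the divergence formula:
    H(alpha) / (H(alpha) + D(alpha || beta))."""
    h = shannon_entropy(alpha)
    d = kl_divergence(alpha, beta)
    return h / (h + d)

# Illustrative measures on a binary alphabet (not from the paper):
alpha = [0.5, 0.5]   # R is assumed random with respect to alpha
beta  = [0.9, 0.1]   # dimension of R is measured with respect to beta
print(divergence_formula(alpha, beta))  # ≈ 0.576
```

By the theorem, this value is the \(\beta\)-dimension (and \(\beta\)-strong dimension) of any sequence \(R\) that is random with respect to \(\alpha\). As a sanity check, when \(\beta\) is the uniform measure one has \(\mathcal{D}(\alpha\,\|\,\beta)=\log|\Sigma|-\mathcal{H}(\alpha)\), so the formula reduces to \(\mathcal{H}(\alpha)/\log|\Sigma|\), the usual constructive dimension of an \(\alpha\)-random sequence.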
constructive dimension
finite-state dimension
Kolmogorov complexity
Kullback-Leibler divergence
randomness
Shannon entropy