Bounds from a card trick (Q414410)

From MaRDI portal
OpenAlex ID: W1988637266
Revision as of 22:28, 19 March 2024

Language: English
Label: Bounds from a card trick
Description: scientific article

    Statements

    Bounds from a card trick (English)
    11 May 2012
    The author considers a string \(s\) of length \(n\) whose characters are drawn uniformly at random from an alphabet of size \(\sigma\). He first shows that for \(k \geq (1 + \epsilon) \log_\sigma n\) and \(\lambda = o (n^\epsilon)\), in the expected case we cannot store \(s\) in \(\lambda n H_k (s) + o (n \log \sigma)\) bits, where \(H_k (s)\) is the \(k\)th-order empirical entropy of \(s\). He then shows that for \(k \geq (2 + \epsilon) \log_\sigma n\), with high probability \(s\) contains no duplicate \(k\)-tuples, so \(H_k (s) = 0\) and we cannot store \(s\) in \(\lambda n H_k (s) + o (n \log \sigma)\) bits for any \(\lambda\). Finally, by similar arguments he shows that, to have at least a \(2/3\) probability of guessing correctly whether a \(\sigma\)-ary string is generated by a \(k\)th-order Markov source with entropy nearly 0 or by an unbiased memoryless source, we must read \(\Omega (\sigma^{k / 2})\) characters.
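The second claim above can be checked empirically: once k exceeds (2 + ε) log_σ n, a uniformly random string almost surely contains no repeated k-tuple, so every length-k context is followed by at most one character and H_k(s) collapses to 0. The following Python sketch (function names and parameter choices are my own, not from the paper) generates a random string over a 4-letter alphabet and verifies this for k = 4 log_σ n, i.e. ε = 2:

```python
import math
import random
from collections import Counter, defaultdict

def empirical_entropy_k(s, k):
    # H_k(s) = (1/n) * sum over length-k contexts w of |s_w| * H_0(s_w),
    # where s_w lists the characters following occurrences of w in s.
    n = len(s)
    followers = defaultdict(list)
    for i in range(n - k):
        followers[s[i:i + k]].append(s[i + k])
    total = 0.0
    for chars in followers.values():
        m = len(chars)
        for c in Counter(chars).values():
            total -= c * math.log2(c / m)
    return total / n

def has_duplicate_ktuple(s, k):
    # True iff some length-k substring occurs at least twice in s.
    seen = set()
    for i in range(len(s) - k + 1):
        t = s[i:i + k]
        if t in seen:
            return True
        seen.add(t)
    return False

random.seed(0)
sigma, n = 4, 4096
s = ''.join(random.choice('ACGT') for _ in range(n))
# k = 4 * log_sigma(n) satisfies k >= (2 + eps) log_sigma n with eps = 2;
# a collision among the ~n k-tuples then has probability about n^2 / sigma^k.
k = 4 * round(math.log(n, sigma))       # log_4 4096 = 6, so k = 24
print(has_duplicate_ktuple(s, k))       # expect False with high probability
print(empirical_entropy_k(s, k))        # expect 0.0 when no k-tuple repeats
```

If no k-tuple repeats, each context's follower list has length at most 1, every per-context entropy term is exactly 0, and the computed H_k(s) is exactly 0.0 even in floating point, illustrating why a λ n H_k(s) + o(n log σ) space bound becomes vacuous in this regime.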
    empirical entropy
    estimating entropy
    Markov sources