Strong Chain Rules for Min-Entropy under Few Bits Spoiled
From MaRDI portal
Publication:6283685
Abstract: It is well established that the notion of min-entropy fails to satisfy the \emph{chain rule} of the form $H(X,Y) = H(Y) + H(X|Y)$, known for Shannon entropy. Such a property would help to analyze how min-entropy splits among smaller blocks. Problems of this kind arise, for example, when constructing extractors and dispersers.

We show that any sequence of random variables exhibits a very strong block-source structure (conditional distributions of blocks are nearly flat) when we \emph{spoil few correlated bits}. This implies that, conditioned on the spoiled bits, \emph{splitting-recombination properties} hold. In particular, we obtain many desirable properties that min-entropy does not obey in general, for example strong chain rules, "information can't hurt" inequalities, and equivalences of average- and worst-case conditional entropy definitions. Quantitatively, for any sequence $X_1,\dots,X_t$ of random variables over an alphabet $\mathcal{X}$ we prove that, when conditioned on few spoiled bits of auxiliary information, all conditional distributions of the form $X_i \mid X_1,\dots,X_{i-1}$ are $\epsilon$-close to nearly flat (only a constant factor away). The argument is combinatorial (based on simplex coverings).

This result may be used as a generic tool for \emph{exhibiting block-source structures}. We demonstrate this by reproving the fundamental converter due to Nisan and Zuckerman (\emph{J. Computer and System Sciences}, 1996), which shows that sampling blocks from a min-entropy source roughly preserves the entropy rate. Our bound implies, by straightforward chain rules alone, an additive loss of $o(1)$ (for sufficiently many samples), which qualitatively matches the first tight analysis of this problem, due to Vadhan (\emph{CRYPTO'03}), obtained by large-deviation techniques.
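The chain-rule failure discussed in the abstract can be checked numerically on a toy example. The sketch below (the two-bit joint distribution is an illustrative choice, not taken from the paper) compares the exact Shannon chain rule $H(X,Y) = H(Y) + H(X|Y)$ with its min-entropy analogue using the worst-case conditional min-entropy:

```python
import math

# Illustrative joint distribution of two bits (X, Y); the outcome (1, 1)
# has probability 0 and is omitted.
joint = {(0, 0): 0.5, (1, 0): 0.25, (0, 1): 0.25}

def shannon(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_ent(p):
    """Min-entropy (in bits): -log2 of the largest probability."""
    return -math.log2(max(p))

# Marginal distribution of Y.
py = {}
for (x, y), q in joint.items():
    py[y] = py.get(y, 0.0) + q

# Conditional distributions X | Y = y.
cond = {y: [q / py[y] for (x, yy), q in joint.items() if yy == y] for y in py}

H_joint = shannon(list(joint.values()))
H_Y = shannon(list(py.values()))
H_X_given_Y = sum(py[y] * shannon(cond[y]) for y in py)   # average over y

Hmin_joint = min_ent(list(joint.values()))
Hmin_Y = min_ent(list(py.values()))
Hmin_X_given_Y = min(min_ent(cond[y]) for y in py)        # worst case over y

print(H_joint, H_Y + H_X_given_Y)           # Shannon chain rule: equal
print(Hmin_joint, Hmin_Y + Hmin_X_given_Y)  # min-entropy: strict gap
```

Here $H_\infty(X,Y) = 1$ bit, while $H_\infty(Y) + H_\infty(X|Y) \approx 0.415$ bits, so the naive min-entropy chain rule loses a constant fraction of the entropy even for two bits; the paper's spoiling of few correlated bits is what restores such identities.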