On the Entropy of a Noisy Function
DOI: 10.1109/TIT.2016.2584625
zbMATH Open: 1359.94359
arXiv: 1508.01464
OpenAlex: W2963653371
MaRDI QID: Q2976554
Author: Alex Samorodnitsky
Publication date: 28 April 2017
Published in: IEEE Transactions on Information Theory
Abstract: Let $0 < \epsilon < 1/2$ be a noise parameter, and let $T_\epsilon$ be the noise operator acting on functions on the boolean cube $\{0,1\}^n$. Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of $T_\epsilon f$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables. In information-theoretic terms, we prove the following strengthening of "Mrs. Gerber's lemma": Let $X$ be a random binary vector of length $n$, and let $Z$ be a noise vector corresponding to a binary symmetric channel with crossover probability $\epsilon$. Then, setting $v = (1-2\epsilon)^2 \cdot n$, we have (up to lower-order terms):
$$H\big(X \oplus Z\big) \;\ge\; n \cdot H\left(\epsilon \,+\, (1-2\epsilon) \cdot H^{-1}\left(\frac{\mathbb{E}_{|B| = v}\, H\big(\{X_i\}_{i \in B}\big)}{v}\right)\right)$$
As an application, we show that for a boolean function $f$ which is close to a characteristic function $g$ of a subcube of dimension $n-1$, the entropy of $T_\epsilon f$ is at most that of $T_\epsilon g$. Combined with a recent result of Ordentlich, Shayevitz, and Weinstein, this shows that the "most informative boolean function" conjecture of Courtade and Kumar holds for high noise $\epsilon \ge 1/2 - \delta$, for some absolute constant $\delta > 0$. Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$, for any boolean function $f$ it holds that $I\big(f(X); Y\big) \le 1 - H(\epsilon)$.
Full work available at URL: https://arxiv.org/abs/1508.01464
Cited in 4 documents.