On the Entropy of a Noisy Function
From MaRDI portal
Publication:2976554
Abstract: Let \(0 < \epsilon < 1/2\) be a noise parameter, and let \(T_\epsilon\) be the noise operator acting on functions on the boolean cube \(\{0,1\}^n\). Let \(f\) be a nonnegative function on \(\{0,1\}^n\). We upper bound the entropy of \(T_\epsilon f\) by the average entropy of conditional expectations of \(f\), given sets of roughly \((1-2\epsilon)^2 n\) variables. In information-theoretic terms, we prove the following strengthening of "Mrs. Gerber's lemma": Let \(X\) be a random binary vector of length \(n\), and let \(Z\) be a noise vector, corresponding to a binary symmetric channel with crossover probability \(\epsilon\). Then, setting \(v = (1-2\epsilon)^2 n\), we have (up to lower-order terms): \[ H\Big(X \oplus Z\Big) \;\ge\; n \cdot H\left(\epsilon \;+\; (1-2\epsilon) \cdot H^{-1}\left(\frac{\mathbb{E}_{|B| = v}\, H\Big(\{X_i\}_{i \in B}\Big)}{v}\right)\right) \] As an application, we show that for a boolean function \(f\) which is close to a characteristic function \(g\) of a subcube of dimension \(n-1\), the entropy of \(T_\epsilon f\) is at most that of \(T_\epsilon g\). This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds for high noise \(\epsilon \ge 1/2 - \delta\), for some absolute constant \(\delta > 0\). Namely, if \(X\) is uniformly distributed in \(\{0,1\}^n\) and \(Y\) is obtained by flipping each coordinate of \(X\) independently with probability \(\epsilon\), then, provided \(\epsilon \ge 1/2 - \delta\), for any boolean function \(f\) it holds that \(I(f(X); Y) \le 1 - H(\epsilon)\).
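The lower bound above is built from the binary entropy function \(H\) and its inverse restricted to \([0, 1/2]\). The following sketch (not from the paper; all function names are hypothetical) evaluates the right-hand side of the strengthened Mrs. Gerber's lemma numerically, assuming the average per-coordinate entropy over subsets \(B\) is supplied as a precomputed number:

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_entropy_inv(h, tol=1e-12):
    # Inverse of H on [0, 1/2], computed by bisection (H is increasing there).
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binary_entropy(mid) < h:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def mgl_lower_bound(n, eps, avg_subset_entropy_rate):
    # Right-hand side of the strengthened lemma:
    #   n * H( eps + (1 - 2*eps) * H^{-1}(E_{|B|=v} H({X_i}_{i in B}) / v) )
    # where avg_subset_entropy_rate stands in for the normalized average
    # subset entropy (a hypothetical precomputed input, in [0, 1]).
    t = binary_entropy_inv(avg_subset_entropy_rate)
    return n * binary_entropy(eps + (1 - 2 * eps) * t)
```

As a sanity check, a maximally random input (entropy rate 1) gives `mgl_lower_bound(10, 0.4, 1.0) == 10.0`, matching the fact that \(H(X \oplus Z) = n\) when \(X\) is uniform.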
Cited in (4)
This page was built for publication: On the Entropy of a Noisy Function