Strong Data Processing Inequalities for Input Constrained Additive Noise Channels


DOI: 10.1109/TIT.2017.2782359
zbMATH Open: 1390.94725
arXiv: 1512.06429
OpenAlex: W2963898870
MaRDI QID: Q4566751
FDO: Q4566751

Yihong Wu, Flávio du Pin Calmon, Yury Polyanskiy

Publication date: 27 June 2018

Published in: IEEE Transactions on Information Theory

Abstract: This paper quantifies the intuitive observation that adding noise reduces available information by means of non-linear strong data processing inequalities. Consider the random variables $W \to X \to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real-valued, independent, and $X$ bounded in $L_p$-norm. It is shown that $I(W;Y) \le F_I(I(W;X))$ with $F_I(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for which couplings $(W,X)$ the mutual information $I(W;Y)$ is close to the maximum possible. To that end we show that in order to saturate the channel, i.e. for $I(W;Y)$ to approach capacity, it is mandatory that $I(W;X) \to \infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma, which shows that the post-convolution total variation distance bounds the pre-convolution Kolmogorov-Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with a quadratic cost constraint. These bounds are shown to be order-optimal. For this case, simplified proofs are provided leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality.
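
For quick reference, the abstract's setup and main inequality can be set down in display form. This is a reconstruction from the abstract's own notation, not a verbatim excerpt from the paper; the bound $a$ is a label introduced here for illustration.

```latex
% Setup described in the abstract: a Markov chain through an
% additive-noise channel, with X bounded in L_p-norm (the constant a
% below is an illustrative label, not notation taken from the paper).
\[
  W \to X \to Y, \qquad Y = X + Z, \qquad
  \|X\|_p := \bigl(\mathbb{E}|X|^p\bigr)^{1/p} \le a .
\]
% Nonlinear strong data processing inequality: there exists a function
% F_I with F_I(t) < t for all t > 0 such that
\[
  I(W;Y) \;\le\; F_I\bigl(I(W;X)\bigr), \qquad
  F_I(t) < t \quad \text{for all } t > 0,
\]
% and, per the abstract, such an F_I exists if and only if Z has a
% density whose support is not disjoint from any translate of itself.
```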


Full work available at URL: https://arxiv.org/abs/1512.06429
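
As a concrete numerical sketch of the strict contraction described in the abstract (my own hypothetical example, not code or an example from the paper): take $W = X$ uniform on $\{-\sqrt{P}, +\sqrt{P}\}$ and $Y = X + Z$ with $Z \sim \mathcal{N}(0,1)$. Then $I(W;X) = H(W) = 1$ bit exactly, while $I(W;Y) = h(Y) - h(Z)$ stays strictly below 1 bit for every finite $P$.

```python
import numpy as np

# Hypothetical illustration of the strict contraction I(W;Y) < I(W;X):
# W = X uniform on {-sqrt(P), +sqrt(P)}, Y = X + Z with Z ~ N(0, 1).
# Here I(W;X) = H(W) = 1 bit exactly, and I(W;Y) = h(Y) - h(Z) < 1 bit.

def bpsk_mutual_info_bits(P, n_grid=40001, y_max=20.0):
    """Numerically evaluate I(W;Y) in bits for a binary input on a Gaussian channel."""
    a = np.sqrt(P)
    y = np.linspace(-y_max, y_max, n_grid)
    dy = y[1] - y[0]
    gauss = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2.0 * np.pi)
    p_y = 0.5 * (gauss(y - a) + gauss(y + a))          # output density of Y
    h_y = -np.sum(p_y * np.log2(p_y + 1e-300)) * dy    # differential entropy h(Y)
    h_z = 0.5 * np.log2(2.0 * np.pi * np.e)            # differential entropy h(Z)
    return h_y - h_z

for P in [0.25, 1.0, 4.0, 16.0]:
    print(f"P = {P:6.2f}: I(W;X) = 1.0000 bit, "
          f"I(W;Y) = {bpsk_mutual_info_bits(P):.4f} bits")
```

The gap $1 - I(W;Y)$ shrinks as $P$ grows but never closes, illustrating the behaviour $F_I(t) < t$ at $t = 1$ bit for the additive Gaussian noise channel with a quadratic cost constraint.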






Cited In (3)

