Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
Publication:4566751
Mathematics Subject Classification:
- Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.) (60J20)
- Statistical aspects of information-theoretic topics (62B10)
- Inequalities; stochastic orderings (60E15)
- Channel models (including quantum) in information and communication theory (94A40)
Abstract: This paper quantifies the intuitive observation that adding noise reduces available information by means of non-linear strong data processing inequalities. Consider the random variables $W \to X \to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real-valued, independent, and $X$ bounded in $L_p$-norm. It is shown that $I(W;Y) \le F_I(I(W;X))$ with $F_I(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for what couplings $(W,X)$ the mutual information $I(W;Y)$ is close to the maximum possible. To that end we show that in order to saturate the channel, i.e., for $I(W;Y)$ to approach capacity, it is mandatory that $I(W;X) \to \infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma which shows that the post-convolution total variation distance bounds the pre-convolution Kolmogorov–Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with quadratic cost constraint, and these bounds are shown to be order-optimal. For this case simplified proofs are provided, leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality.
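As a reading aid, the displays below restate the abstract's setting in standard notation. This is a sketch assembled from the abstract alone: the constant $a$ for the $L_p$ budget is an illustrative placeholder, $F_I$ denotes the paper's non-linear SDPI function (whose explicit form is given in the paper, not here), and the deconvolution statement is schematic, with constants and regularity conditions on $Z$ left to the paper.

% Markov chain and additive-noise setting from the abstract; the bound a on
% the L_p norm of X is an illustrative placeholder.
\[
  W \longrightarrow X \longrightarrow Y, \qquad
  Y = X + Z, \qquad
  X, Z \ \text{independent}, \qquad
  \mathbb{E}\bigl[|X|^{p}\bigr] \le a^{p}.
\]
% Non-linear strong data processing inequality: F_I lies strictly below the
% diagonal, quantifying how convolving with the noise Z strictly dissipates
% information.
\[
  I(W;Y) \;\le\; F_I\bigl(I(W;X)\bigr),
  \qquad F_I(t) < t \quad \text{for all } t > 0.
\]
% Schematic form of the deconvolution lemma (constants and conditions on Z
% as in the paper): total variation after convolution with P_Z controls the
% Kolmogorov–Smirnov distance before convolution.
\[
  d_{\mathrm{KS}}(P_X, Q_X) \;\lesssim\; d_{\mathrm{TV}}(P_X * P_Z,\; Q_X * P_Z).
\]
% Gaussian special case: quadratic cost E[X^2] <= P, noise Z ~ N(0, sigma^2).
% "Saturating the channel" means I(W;Y) approaches the classical AWGN capacity
% (in nats):
\[
  C = \tfrac{1}{2}\log\!\Bigl(1 + \tfrac{P}{\sigma^{2}}\Bigr).
\]
% The Gaussian-specific I-MMSE identity (Guo–Shamai–Verdú) used in the
% simplified proofs: for N ~ N(0,1) independent of X and SNR parameter s,
\[
  \frac{\mathrm{d}}{\mathrm{d}s}\, I\bigl(X;\sqrt{s}\,X + N\bigr)
  = \tfrac{1}{2}\,\mathrm{mmse}(X,s),
  \qquad
  \mathrm{mmse}(X,s) = \mathbb{E}\Bigl[\bigl(X - \mathbb{E}[X \mid \sqrt{s}\,X + N]\bigr)^{2}\Bigr].
\]

The strict gap $F_I(t) < t$ is what fails without the input constraint: for the unconstrained additive Gaussian channel the standard (linear) contraction coefficient equals one, which is why a bound such as $\mathbb{E}|X|^p \le a^p$ is essential to the paper's non-linear strengthening.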
Recommendations
- Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels
- Strong data-processing inequalities for channels and Bayesian networks
- A New Data Processing Inequality and Its Applications in Distributed Source and Channel Coding
- MMSE Bounds for Additive Noise Channels Under Kullback–Leibler Divergence Constraints on the Input Distribution
- Capacity Bounds for Additive Symmetric $\alpha$-Stable Noise Channels
- On Properties of the Support of Capacity-Achieving Distributions for Additive Noise Channel Models With Input Cost Constraints
- Worst case additive noise for binary-input channels and zero-threshold detection under constraints of power and divergence
- Bounds on the Capacity of Random Insertion and Deletion-Additive Noise Channels
- On some properties of Gaussian channels with strongly equivalent noises
Cited in 3 documents