Strong data-processing inequalities for channels and Bayesian networks
From MaRDI portal
Abstract: The data-processing inequality, that is, $I(U;Y) \le I(U;X)$ for a Markov chain $U \to X \to Y$, has been the method of choice for proving impossibility (converse) results in information theory and many other disciplines. Various channel-dependent improvements (called strong data-processing inequalities, or SDPIs) of this inequality have been proposed both classically and more recently. In this note we first survey known results relating various notions of contraction for a single channel. We then consider the basic extension: given an SDPI for each constituent channel in a Bayesian network, how does one produce an end-to-end SDPI? Our approach is based on the (extract of the) Evans-Schulman method, which is demonstrated for three different kinds of SDPIs, namely the usual Ahlswede-Gács type contraction coefficients (mutual information), Dobrushin's contraction coefficients (total variation), and finally the $F_I$-curve (the best possible non-linear SDPI for a given channel). The resulting bounds on the contraction coefficients are interpreted as probabilities of site percolation. As an example, we demonstrate how to obtain an SDPI for an $n$-letter memoryless channel with feedback given an SDPI for $n=1$. Finally, we discuss a simple observation on the equivalence of a linear SDPI and comparison to an erasure channel (in the sense of the "less noisy" order). This leads to a simple proof of a curious inequality of Samorodnitsky (2015), and sheds light on how information spreads in subsets of the inputs of a memoryless channel.
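For concreteness, one of the three contraction notions mentioned in the abstract, Dobrushin's coefficient for total variation, admits a very short computation: for a discrete channel given as a row-stochastic matrix $P(y|x)$, it equals the maximum total-variation distance between any two rows, and the corresponding linear SDPI contracts the TV distance between any two input distributions by this factor. The sketch below is illustrative only (the function names `tv` and `eta_tv` are ours, not from the paper) and checks the textbook value $\eta_{TV} = |1 - 2\delta|$ for a binary symmetric channel.

```python
# Illustrative sketch (not code from the paper): Dobrushin's contraction
# coefficient eta_TV of a discrete channel P(y|x), represented as a
# row-stochastic matrix, is the maximum total-variation distance between
# any two rows. The linear SDPI for total variation then reads
#   TV(mu P, nu P) <= eta_TV * TV(mu, nu).

from itertools import combinations

def tv(p, q):
    """Total-variation distance between two probability vectors."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def eta_tv(channel):
    """Dobrushin coefficient: max TV distance over all pairs of rows."""
    return max(tv(r, s) for r, s in combinations(channel, 2))

# Binary symmetric channel with crossover probability delta:
# rows are (1-delta, delta) and (delta, 1-delta), so eta_TV = |1 - 2*delta|.
delta = 0.1
bsc = [[1 - delta, delta], [delta, 1 - delta]]
print(eta_tv(bsc))  # ≈ 0.8 for delta = 0.1
```

For a channel whose rows are all identical (a useless channel) the coefficient is 0, and for a noiseless channel it is 1; strict inequality $\eta_{TV} < 1$ is exactly what makes the data-processing inequality "strong".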
Recommendations
- Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels
- Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
- Data-Processing Inequalities Based on a Certain Structured Class of Information Measures With Application to Estimation Theory
- A strengthened data processing inequality for the Belavkin-Staszewski relative entropy
- A New Data Processing Inequality and Its Applications in Distributed Source and Channel Coding
- Proving and Disproving Information Inequalities: Theory and Scalable Algorithms
- scientific article; zbMATH DE number 1152974
- Information inequalities for the Bayes risk
- Strong Converse Bounds in Quantum Network Information Theory
Cited in (9)
- Universal Features for High-Dimensional Learning and Inference
- scientific article; zbMATH DE number 7370576
- Non-linear log-Sobolev inequalities for the Potts semigroup and applications to reconstruction problems
- Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
- Comparison of contraction coefficients for f-divergences
- Application of the information-percolation method to reconstruction problems on graphs
- Complete entropic inequalities for quantum Markov chains
- An information-percolation bound for spin synchronization on general graphs
- Offline reinforcement learning with representations for actions
This page was built for publication: Strong data-processing inequalities for channels and Bayesian networks
MaRDI item Q2406343