Comparison of contraction coefficients for f-divergences
From MaRDI portal
Abstract: Data processing inequalities for \(f\)-divergences can be sharpened using constants called "contraction coefficients" to produce strong data processing inequalities. For any discrete source-channel pair, the contraction coefficients for \(f\)-divergences are lower bounded by the contraction coefficient for \(\chi^2\)-divergence. In this paper, we elucidate that this lower bound can be achieved by driving the input \(f\)-divergences of the contraction coefficients to zero. Then, we establish a linear upper bound on the contraction coefficients for a certain class of \(f\)-divergences using the contraction coefficient for \(\chi^2\)-divergence, and refine this upper bound for the salient special case of Kullback-Leibler (KL) divergence. Furthermore, we present an alternative proof of the fact that the contraction coefficients for KL and \(\chi^2\)-divergences are equal for a Gaussian source with an additive Gaussian noise channel (where the former coefficient can be power constrained). Finally, we generalize the well-known result that the contraction coefficients of channels (after extremizing over all possible sources) are equal for all \(f\)-divergences with non-linear operator convex \(f\). In particular, we prove that the so-called "less noisy" preorder over channels can be equivalently characterized by any non-linear operator convex \(f\)-divergence.
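The \(\chi^2\)-divergence contraction coefficient that anchors the bounds in the abstract can be computed for a discrete source-channel pair via the classical singular-value (maximal correlation) characterization, which appears in works cited below (e.g. "Spreading of sets in product spaces and hypercontraction of the Markov operator" and "On measures of dependence"). The sketch below assumes that characterization: \(\eta_{\chi^2}\) equals the squared second-largest singular value of the matrix \(B = \mathrm{diag}(P_X)^{1/2}\, W\, \mathrm{diag}(P_Y)^{-1/2}\) with \(P_Y = P_X W\). The function name is illustrative, not from the paper.

```python
import numpy as np

def chi_squared_contraction(p_x, W):
    """Contraction coefficient for chi^2-divergence of a discrete
    source-channel pair (p_x, W), assuming the classical
    singular-value characterization: eta_{chi^2} equals the squared
    second-largest singular value of
    B = diag(p_x)^{1/2} W diag(p_y)^{-1/2}, where p_y = p_x W."""
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_y = p_x @ W  # output distribution induced by the channel
    B = np.sqrt(p_x)[:, None] * W / np.sqrt(p_y)[None, :]
    s = np.linalg.svd(B, compute_uv=False)  # sorted descending; s[0] == 1
    return s[1] ** 2

# Binary symmetric channel with crossover 0.1 and uniform input:
# the characterization gives eta_{chi^2} = (1 - 2*0.1)^2 = 0.64.
eta = chi_squared_contraction([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]])
```

For the binary symmetric channel the normalized matrix \(B\) reduces to the channel matrix itself (uniform input gives a uniform output), so its second singular value is \(1 - 2\delta\), recovering the familiar value \((1-2\delta)^2\).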
Recommendations
- scientific article; zbMATH DE number 222635
- scientific article; zbMATH DE number 4197134
- Bounds for \(f\)-divergences under likelihood ratio constraints.
- On an Extension of the Notion of \(f\)-Divergence
- $f$-Divergence Inequalities
- Bregman divergences from comparative convexity
- Convergence of contrastive divergence algorithm in exponential family
Cites work
- scientific article; zbMATH DE number 3885093
- scientific article; zbMATH DE number 3146421
- scientific article; zbMATH DE number 3152029
- scientific article; zbMATH DE number 3842950
- scientific article; zbMATH DE number 4072103
- scientific article; zbMATH DE number 3559440
- scientific article; zbMATH DE number 3562959
- scientific article; zbMATH DE number 1207031
- scientific article; zbMATH DE number 1049347
- scientific article; zbMATH DE number 1064667
- scientific article; zbMATH DE number 1560711
- scientific article; zbMATH DE number 1765864
- scientific article; zbMATH DE number 3444596
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 6125590
- scientific article; zbMATH DE number 3228255
- scientific article; zbMATH DE number 3252891
- scientific article; zbMATH DE number 3307119
- scientific article; zbMATH DE number 3322635
- scientific article; zbMATH DE number 961607
- scientific article; zbMATH DE number 3200971
- scientific article; zbMATH DE number 3059245
- $f$-Divergence Inequalities
- A Coordinate System for Gaussian Networks
- A Distribution Dependent Refinement of Pinsker's Inequality
- A New Data Processing Inequality and Its Applications in Distributed Source and Channel Coding
- A class of measures of informativity of observation channels
- A class of modified Pearson and Neyman statistics
- A variational characterization of canonical angles between subspaces
- Asymptotic methods in statistical decision theory
- Clustering with Bregman divergences.
- Coefficients of ergodicity: structure and applications
- Comparison of Channels: Criteria for Domination by a Symmetric Channel
- Context tree estimation for not necessarily finite memory processes, via BIC and MDL
- Diffusion maps
- Dissipation of Information in Channels With Input Constraints
- Elements of Information Theory
- Equivalence of certain entropy contraction coefficients
- Ergodicity coefficients defined by vector norms
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Estimation Efficiency Under Privacy Constraints
- Evaluation of Marton's Inner Bound for the General Broadcast Channel
- Information Theory and Statistics: A Tutorial
- Information theoretic inequalities
- Information theory. Coding theorems for discrete memoryless systems
- Markov Processes and the H-Theorem
- Markov chains and mixing times. With a chapter on ``Coupling from the past'' by James G. Propp and David B. Wilson.
- Maximal coupling
- Network information theory
- Non-negative matrices and Markov chains. 2nd ed
- Obtaining measure concentration from Markov contraction
- On Choosing and Bounding Probability Metrics
- On Divergences and Informations in Statistics and Information Theory
- On Information and Sufficiency
- On Pairs of $f$-Divergences and Their Joint Range
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- On Sequences of Pairs of Dependent Random Variables
- On a special class of broadcast channels with confidential messages
- On functionals satisfying a data-processing theorem
- On measures of dependence
- On the Entropy of a Noisy Function
- Polynomial Singular Value Decompositions of a Family of Source-Channel Models
- Principal Inertia Components and Applications
- Relations Between Two Sets of Variates
- Refinements of Pinsker's inequality
- Relative entropy under mappings by stochastic matrices
- Signal propagation and noisy circuits
- Spreading of sets in product spaces and hypercontraction of the Markov operator
- Strong Data Processing Inequalities and \(\Phi\)-Sobolev Inequalities for Discrete Channels
- Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
- Strong data-processing inequalities for channels and Bayesian networks
- The Geometric Interpretation of Correspondence Analysis
- The Structure of Bivariate Distributions
- The efficiency of investment information
- Trace inequalities and quantum entropy: an introductory course
Cited in (3)
This page was built for publication: Comparison of contraction coefficients for \(f\)-divergences
MaRDI item Q784380