Comparison of contraction coefficients for \(f\)-divergences
Publication: 784380
DOI: 10.1134/S0032946020020015
zbMath: 1457.94063
arXiv: 1510.01844
OpenAlex: W3042492913
MaRDI QID: Q784380
Publication date: 3 August 2020
Published in: Problems of Information Transmission
Full work available at URL: https://arxiv.org/abs/1510.01844
Keywords: contraction coefficient; maximal correlation; \(f\)-divergence/relative \(f\)-entropy; less noisy preorder; strong data processing inequality
MSC classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
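As brief background on the keywords (standard definitions; the notation \(W\), \(P\), \(Q\), \(\eta_f\) is introduced here for orientation and is not taken from the record): for a channel, i.e. a row-stochastic matrix \(W\), the contraction coefficient under an \(f\)-divergence \(D_f\) is typically defined as
\[
\eta_f(W) \;=\; \sup_{P,\,Q\,:\; 0 < D_f(P \,\|\, Q) < \infty} \frac{D_f(PW \,\|\, QW)}{D_f(P \,\|\, Q)} \;\le\; 1,
\]
so that the strong data processing inequality \(D_f(PW \,\|\, QW) \le \eta_f(W)\, D_f(P \,\|\, Q)\) holds with the best possible constant; the paper compares such coefficients across different choices of \(f\).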
Related Items
- Universal Features for High-Dimensional Learning and Inference
- The \(f\)-divergence and coupling of probability distributions
- On the maximum \(f\)-divergence of probability distributions given the value of their coupling
Cites Work
- Asymptotic methods in statistical decision theory
- Non-negative matrices and Markov chains. 2nd ed.
- Relative entropy under mappings by stochastic matrices
- Spreading of sets in product spaces and hypercontraction of the Markov operator
- Equivalence of certain entropy contraction coefficients
- A variational characterization of canonical angles between subspaces
- Strong data-processing inequalities for channels and Bayesian networks
- Diffusion maps
- A class of measures of informativity of observation channels
- Obtaining Measure Concentration from Markov Contraction
- \(f\)-Divergence Inequalities
- On the Entropy of a Noisy Function
- Strong Data Processing Inequalities and \(\Phi\)-Sobolev Inequalities for Discrete Channels
- Dissipation of Information in Channels With Input Constraints
- Ergodicity Coefficients Defined by Vector Norms
- Network Information Theory
- On Sequences of Pairs of Dependent Random Variables
- The Structure of Bivariate Distributions
- On measures of dependence
- Context tree estimation for not necessarily finite memory processes, via BIC and MDL
- A Distribution Dependent Refinement of Pinsker's Inequality
- On Divergences and Informations in Statistics and Information Theory
- Estimating Optimal Transformations for Multiple Regression and Correlation
- The Geometric Interpretation of Correspondence Analysis
- Information theoretic inequalities
- Maximal coupling
- Coefficients of ergodicity: structure and applications
- On a special class of broadcast channels with confidential messages
- The efficiency of investment information
- Signal propagation and noisy circuits
- Polynomial Singular Value Decompositions of a Family of Source-Channel Models
- Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
- Estimation Efficiency Under Privacy Constraints
- Refinements of Pinsker's inequality
- Comparison of Channels: Criteria for Domination by a Symmetric Channel
- On Choosing and Bounding Probability Metrics
- On functionals satisfying a data-processing theorem
- Evaluation of Marton's Inner Bound for the General Broadcast Channel
- A Coordinate System for Gaussian Networks
- On Pairs of \(f\)-Divergences and Their Joint Range
- A New Data Processing Inequality and Its Applications in Distributed Source and Channel Coding
- On Pinsker's and Vajda's Type Inequalities for Csiszár's \(f\)-Divergences
- Principal Inertia Components and Applications
- Elements of Information Theory
- Markov Processes and the H-Theorem
- Relations Between Two Sets of Variates
- On Information and Sufficiency
- Information Theory
- Information Theory and Statistics: A Tutorial