Comparison of contraction coefficients for f-divergences

From MaRDI portal
Publication: 784380

DOI: 10.1134/S0032946020020015 · zbMATH Open: 1457.94063 · arXiv: 1510.01844 · OpenAlex: W3042492913 · MaRDI QID: Q784380


Authors: Anuran Makur, Lizhong Zheng


Publication date: 3 August 2020

Published in: Problems of Information Transmission

Abstract: Data processing inequalities for f-divergences can be sharpened using constants called "contraction coefficients" to produce strong data processing inequalities. For any discrete source-channel pair, the contraction coefficients for f-divergences are lower bounded by the contraction coefficient for χ²-divergence. In this paper, we show that this lower bound can be achieved by driving the input f-divergences of the contraction coefficients to zero. Then, we establish a linear upper bound on the contraction coefficients for a certain class of f-divergences using the contraction coefficient for χ²-divergence, and refine this upper bound for the salient special case of Kullback-Leibler (KL) divergence. Furthermore, we present an alternative proof of the fact that the contraction coefficients for KL and χ²-divergences are equal for a Gaussian source with an additive Gaussian noise channel (where the former coefficient may be power-constrained). Finally, we generalize the well-known result that the contraction coefficients of channels (after extremizing over all possible sources) are equal for all f-divergences with non-linear operator convex f. In particular, we prove that the so-called "less noisy" preorder over channels can be equivalently characterized by any non-linear operator convex f-divergence.
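Both quantities discussed in the abstract can be computed for a discrete source-channel pair: the χ²-contraction coefficient is the squared second-largest singular value of a normalized channel matrix, and the paper's first result says the KL contraction ratio approaches it as the input divergence tends to zero. A minimal NumPy sketch (the function names and the binary-symmetric-channel example are illustrative, not from the paper):

```python
import numpy as np

def chi2_contraction(P, W):
    """chi^2 contraction coefficient of the source-channel pair (P, W):
    the squared second-largest singular value of the matrix
    B[x, y] = sqrt(P(x)) * W[x, y] / sqrt(Q(y)), where Q = P W is the
    output distribution (the standard maximal-correlation characterization)."""
    P, W = np.asarray(P, float), np.asarray(W, float)
    Q = P @ W                                    # output distribution
    B = np.sqrt(P)[:, None] * W / np.sqrt(Q)[None, :]
    s = np.linalg.svd(B, compute_uv=False)       # s[0] is always 1
    return s[1] ** 2

def kl(a, b):
    """KL divergence between discrete distributions a and b (nats)."""
    return float(np.sum(a * np.log(a / b)))

# Binary symmetric channel BSC(delta) with uniform input: the chi^2
# contraction coefficient is (1 - 2*delta)^2.
delta = 0.1
W = np.array([[1 - delta, delta], [delta, 1 - delta]])
P = np.array([0.5, 0.5])
eta_chi2 = chi2_contraction(P, W)                # 0.64 for delta = 0.1

# Illustrating the paper's first result: perturb the input along the
# second left singular vector of B; the local KL contraction ratio
# approaches eta_chi2 as the perturbation (hence input KL) tends to zero.
B = np.sqrt(P)[:, None] * W / np.sqrt(P @ W)[None, :]
U, s, Vt = np.linalg.svd(B)
eps = 1e-4
P2 = P + eps * np.sqrt(P) * U[:, 1]              # still a distribution
ratio = kl(P2 @ W, P @ W) / kl(P2, P)
print(eta_chi2, ratio)                           # ratio -> eta_chi2 as eps -> 0
```

The perturbation `P + eps * sqrt(P) * u` stays a valid distribution because the second left singular vector `u` is orthogonal to `sqrt(P)`, the top singular vector of `B`.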


Full work available at URL: https://arxiv.org/abs/1510.01844










Cited In (3)





