Rényi Divergence and Kullback-Leibler Divergence

From MaRDI portal
Publication: 2986250

DOI: 10.1109/TIT.2014.2320500
zbMATH Open: 1360.94180
arXiv: 1206.2459
OpenAlex: W2026653933
Wikidata: Q59408941
MaRDI QID: Q2986250


Authors: Tim van Erven, Peter Harremoës


Publication date: 16 May 2017

Published in: IEEE Transactions on Information Theory

Abstract: Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of sigma-algebras and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
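As a concrete illustration of the order parameter mentioned in the abstract (this sketch is not part of the original entry), the following Python/NumPy snippet computes the Rényi divergence of order α between two finite discrete distributions with full support, using the standard definition D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α), and checks numerically that it approaches the Kullback-Leibler divergence as α → 1. The example distributions `p` and `q` are arbitrary choices for illustration.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1), in nats,
    between finite discrete distributions p and q with full support:
        D_alpha(P||Q) = log( sum_i p_i**alpha * q_i**(1 - alpha) ) / (alpha - 1)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_i p_i log(p_i / q_i), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Arbitrary example distributions on three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# As alpha -> 1, the Rényi divergence converges to the KL divergence,
# matching the order-1 special case described in the abstract.
print(renyi_divergence(p, q, 0.999999))
print(kl_divergence(p, q))
```

The paper also shows that Rényi divergence is nondecreasing in its order, which this sketch can be used to check numerically (e.g. compare orders 0.5 and 2 for the same pair of distributions).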


Full work available at URL: https://arxiv.org/abs/1206.2459






