Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals


DOI: 10.1016/J.INS.2007.07.017
zbMATH Open: 1125.94005
arXiv: math-ph/0601035
OpenAlex: W1970048394
MaRDI QID: Q2465342
FDO: Q2465342


Authors: Ambedkar Dukkipati, Shalabh Bhatnagar, M. Narasimha Murty


Publication date: 3 January 2008

Published in: Information Sciences

Abstract: The measure-theoretic definition of Kullback-Leibler relative entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information, and conditional forms of entropy can be expressed in terms of KL-entropy, and hence properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem (Pinsker, 1960, Theorem 2.4.2), which states that measure-theoretic relative entropy equals the supremum of relative entropies over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative entropy of order greater than one. Consequently, the result can be easily extended to Tsallis relative entropy.
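For context, a sketch of the quantities involved, written in standard notation and assuming absolute continuity $P \ll Q$ (the precise hypotheses are as stated in the paper). The classical GYP theorem referenced in the abstract reads

\[
D(P \,\|\, Q) \;=\; \int \log\frac{dP}{dQ}\, dP \;=\; \sup_{\{E_1,\dots,E_n\}} \sum_{i=1}^{n} P(E_i)\,\log\frac{P(E_i)}{Q(E_i)},
\]

where the supremum is over all finite measurable partitions of the underlying space. The Rényi relative entropy of order $\alpha$ is conventionally defined as

\[
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha-1}\,\log \int \left(\frac{dP}{dQ}\right)^{\alpha} dQ,
\]

and the paper's GYP-type result asserts the analogous partition identity, here written for $\alpha > 1$ (modulo the exact conditions given in the paper):

\[
D_\alpha(P \,\|\, Q) \;=\; \sup_{\{E_1,\dots,E_n\}} \frac{1}{\alpha-1}\,\log \sum_{i=1}^{n} P(E_i)^{\alpha}\, Q(E_i)^{1-\alpha}.
\]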


Full work available at URL: https://arxiv.org/abs/math-ph/0601035




