Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
Publication: Q2465342
DOI: 10.1016/J.INS.2007.07.017
zbMATH Open: 1125.94005
arXiv: math-ph/0601035
OpenAlex: W1970048394
MaRDI QID: Q2465342
Authors: Ambedkar Dukkipati, Shalabh Bhatnagar, M. Narasimha Murty
Publication date: 3 January 2008
Published in: Information Sciences
Abstract: The measure-theoretic definition of Kullback-Leibler relative entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information, and the conditional forms of entropy can be expressed in terms of KL-entropy, and hence the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem (Pinsker, 1960, Theorem 2.4.2), which states that measure-theoretic relative entropy equals the supremum of relative entropies over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative entropy of order greater than one. Consequently, the result extends easily to Tsallis relative entropy.
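For orientation, the statement at issue can be sketched as follows in standard notation. This is a reader's sketch, not text from the paper; the precise regularity hypotheses are as stated there.

```latex
% Classical GYP theorem (Pinsker, 1960, Theorem 2.4.2): for probability
% measures P, Q on (X, \mathcal{F}) with P absolutely continuous w.r.t. Q,
I(P \| Q)
  = \int_X \ln \frac{\mathrm{d}P}{\mathrm{d}Q} \, \mathrm{d}P
  = \sup_{\{E_1, \dots, E_n\}} \sum_{k=1}^{n} P(E_k) \ln \frac{P(E_k)}{Q(E_k)},
% the supremum taken over all finite measurable partitions of X.
%
% The paper's generalization (sketch), for Rényi order \alpha > 1:
I_\alpha(P \| Q)
  = \frac{1}{\alpha - 1} \ln \int_X
      \left( \frac{\mathrm{d}P}{\mathrm{d}Q} \right)^{\alpha} \mathrm{d}Q
  = \sup_{\{E_1, \dots, E_n\}} \frac{1}{\alpha - 1}
      \ln \sum_{k=1}^{n} P(E_k)^{\alpha} \, Q(E_k)^{1 - \alpha}.
% Tsallis relative entropy is a monotone function of I_\alpha, which is why
% the partition characterization transfers to it directly.
```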
Full work available at URL: https://arxiv.org/abs/math-ph/0601035
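As a quick numerical illustration of the partition characterization (my own sketch, not code from the paper): for two Gaussians the KL divergence is known in closed form, and the discrete relative entropy computed over finer interval partitions climbs toward it without exceeding it, as the classical GYP theorem predicts. The choice of N(0,1) and N(1,1), the cut-point range [-6, 7], and the cell counts are all arbitrary illustration parameters.

```python
import numpy as np
from scipy.stats import norm

# Illustration (not from the paper): P = N(0,1), Q = N(1,1).
# Exact KL(P||Q) = (mu_P - mu_Q)^2 / (2 sigma^2) = 0.5.
# GYP: the relative entropy over any finite measurable partition is a lower
# bound, and the supremum over all partitions attains the exact value.
P, Q = norm(loc=0.0, scale=1.0), norm(loc=1.0, scale=1.0)

for n_cells in [2, 8, 32, 128, 512]:
    # Partition the real line into n_cells intervals: two unbounded tails
    # plus equally spaced interior cut points on [-6, 7].
    cuts = np.linspace(-6.0, 7.0, n_cells - 1)
    p = np.diff(np.concatenate(([0.0], P.cdf(cuts), [1.0])))
    q = np.diff(np.concatenate(([0.0], Q.cdf(cuts), [1.0])))
    mask = (p > 0) & (q > 0)  # skip cells whose mass underflows to zero
    d = np.sum(p[mask] * np.log(p[mask] / q[mask]))
    print(f"{n_cells:4d} cells: partition relative entropy = {d:.6f}")

print("exact KL(P||Q) = 0.5")
```

The printed values increase with the number of cells toward the exact value 0.5, consistent with the supremum-over-partitions characterization.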
Recommendations
- Generalized 'useful' relative information measures of type \((\alpha, \beta)\)
- A short characterization of relative entropy
- A generalization of the Kullback-Leibler divergence and its properties
- On the connections of generalized entropies with Shannon and Kolmogorov-Sinai entropies
Cites Work
- On Information and Sufficiency
- The world according to Rényi: Thermodynamics of multifractal systems
- Possible generalization of Boltzmann-Gibbs statistics
- Generalized information functions
- On measures of information and their characterizations
- Calculation of the amount of information about a random function contained in another such function
- A coding theorem and Rényi's entropy
- Expressions for Rényi and Shannon entropies for bivariate distributions
- Rényi's entropy as an index of diversity in simple-stage cluster sampling
- The many facets of entropy
- Basic properties of the generalized Boltzmann-Gibbs-Shannon entropy
Cited In (4)