Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
Abstract: The measure-theoretic definition of Kullback-Leibler relative entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information, and conditional forms of entropy can all be expressed in terms of KL-entropy, and hence properties of their measure-theoretic analogues follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem (Pinsker, 1960, Theorem 2.4.2), which states that measure-theoretic relative entropy equals the supremum of relative entropies taken over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative entropy of order greater than one. Consequently, the result extends easily to Tsallis relative entropy.
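For orientation, here is a minimal sketch of the statements involved, in standard notation that is assumed here rather than drawn from the paper itself:
\[
D(P\,\|\,Q) \;=\; \int_X \log\frac{dP}{dQ}\,dP \;=\; \sup_{\mathcal{P}}\;\sum_{A\in\mathcal{P}} P(A)\,\log\frac{P(A)}{Q(A)},
\]
\[
D_{\alpha}(P\,\|\,Q) \;=\; \frac{1}{\alpha-1}\,\log\int_X \Bigl(\frac{dP}{dQ}\Bigr)^{\alpha-1}\,dP
\;=\; \sup_{\mathcal{P}}\;\frac{1}{\alpha-1}\,\log\sum_{A\in\mathcal{P}} \frac{P(A)^{\alpha}}{Q(A)^{\alpha-1}},
\qquad \alpha>1,
\]
where \(P\ll Q\) are probability measures on a measurable space \((X,\mathcal{F})\) and the suprema run over all finite measurable partitions \(\mathcal{P}\) of \(X\). The first identity is the classical GYP theorem; the second is the analogous statement for Rényi relative entropy of order \(\alpha>1\), the case treated in the paper. Since Tsallis relative entropy is a monotone function of \(D_{\alpha}\) for fixed \(\alpha>1\), the corresponding statement for it follows, as the abstract notes.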
Cites work
- scientific article; zbMATH DE number 1803724
- scientific article; zbMATH DE number 3119624
- scientific article; zbMATH DE number 3162348
- scientific article; zbMATH DE number 3166608
- scientific article; zbMATH DE number 3170754
- scientific article; zbMATH DE number 48436
- scientific article; zbMATH DE number 3228255
- scientific article; zbMATH DE number 3238721
- scientific article; zbMATH DE number 3284914
- scientific article; zbMATH DE number 3062467
- A coding theorem and Rényi's entropy
- Basic properties of the generalized Boltzmann-Gibbs-Shannon entropy
- Calculation of the amount of information about a random function contained in another such function
- Expressions for Rényi and Shannon entropies for bivariate distributions
- Generalized information functions
- On Information and Sufficiency
- On measures of information and their characterizations
- Possible generalization of Boltzmann-Gibbs statistics
- Renyi's entropy as an index of diversity in simple-stage cluster sampling
- The many facets of entropy
- The world according to Rényi: Thermodynamics of multifractal systems
Cited in (4)