Optimal guessing under nonextensive framework and associated moment bounds
From MaRDI portal
Publication:6165367
Abstract: We consider the problem of guessing the realization of a random variable, but under Tsallis' more general non-extensive entropic framework rather than the classical Maxwell-Boltzmann-Gibbs-Shannon framework. We consider both the conditional guessing problem, where some related side information is available, and the unconditional one, where no such side information is available. For both types of problem, non-extensive moment bounds on the required number of guesses are derived; here we use the q-normalized expectation in place of the usual (linear) expectation to define the non-extensive moments. These moment bounds turn out to be functions of the logarithmic norm entropy measure, a recently developed two-parameter generalization of the Rényi entropy, which gives them an information-theoretic interpretation. We also consider the case of an uncertain source distribution and derive the non-extensive moment bounds for the corresponding mismatched guessing function. Interestingly, these mismatched bounds are linked with an important robust statistical divergence family known as the relative α-entropies; a similar link is discussed between the optimum mismatched guessing and the extremes of these relative entropy measures.
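To make the guessing setup concrete, the following sketch illustrates the classical (Arikan-type) version of the unconditional problem that the paper generalizes: the optimal guessing strategy queries outcomes in decreasing order of probability, and the ρ-th guessing moment is upper-bounded by exp(ρ·H_α) with Rényi order α = 1/(1+ρ). This is a minimal illustration of the classical bound only, not of the paper's non-extensive (q-normalized) moments; all function names here are hypothetical.

```python
import numpy as np

def optimal_guess_moment(p, rho):
    """rho-th moment of the optimal guessing function: guess number k
    is assigned to the k-th most probable outcome."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]  # decreasing order
    guesses = np.arange(1, len(p) + 1)
    return np.sum(p * guesses**rho)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

# Example: skewed 4-point source, first moment (rho = 1).
p = np.array([0.5, 0.25, 0.125, 0.125])
rho = 1.0
moment = optimal_guess_moment(p, rho)          # 1.875 for this source
bound = np.exp(rho * renyi_entropy(p, 1.0 / (1.0 + rho)))
print(moment, bound)  # moment never exceeds the Rényi-entropy bound
```

The Rényi order 1/(1+ρ) is the one appearing in Arikan's classical moment bounds; the paper replaces this role of the Rényi entropy by the two-parameter logarithmic norm entropy.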
Cites work
- scientific article; zbMATH DE number 3173999
- scientific article; zbMATH DE number 6541785
- A Mathematical Theory of Communication
- A Scale-Invariant Generalization of the Rényi Entropy, Associated Divergences and Their Optimizations Under Tsallis’ Nonextensive Framework
- A possible extension of Shannon's information theory
- An inequality on guessing and its application to sequential decoding
- Asymptotically scale-invariant occupancy of phase space makes the entropy S_q extensive
- Complexity through nonextensivity
- Generalization of Shannon–Khinchin Axioms to Nonextensive Systems and the Uniqueness Theorem for the Nonextensive Entropy
- Guessing Under Source Uncertainty
- Information gain within nonextensive thermostatistics
- Introduction to Nonextensive Statistical Mechanics
- Minimization Problems Based on Relative α-Entropy I: Forward Projection
- Nonextensive information theoretic kernels on measures
- On Information and Sufficiency
- Possible generalization of Boltzmann-Gibbs statistics.
- Source coding theorem based on a nonadditive information content
- Source coding with escort distributions and Rényi entropy bounds
- The logarithmic super divergence and asymptotic inference properties
Cited in (3)