On some entropy functionals derived from Rényi information divergence

From MaRDI portal

DOI: 10.1016/J.INS.2008.02.003 · zbMATH Open: 1172.94542 · arXiv: 0805.0129 · OpenAlex: W1970605442 · MaRDI QID: Q1031673 · FDO: Q1031673


Authors: J-F Bercher


Publication date: 30 October 2009

Published in: Information Sciences

Abstract: We consider the maximum entropy problems associated with Rényi Q-entropy, subject to two kinds of constraints on expected values. The constraints considered are a constraint on the standard expectation and a constraint on the generalized expectation as encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit a power-law behaviour, are derived and characterized. The Rényi entropy of the optimum distributions can be viewed as a function of the constraint. This defines two families of entropy functionals in the space of possible expected values. General properties of these functionals, including nonnegativity, minimum, and convexity, are documented. Their relationships as well as numerical aspects are also discussed. Finally, we work out some specific cases for the reference measure Q(x) and recover some well-known entropies in a limit case.
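The central object behind the abstract, the Rényi information divergence of order α, is for discrete distributions D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α), with the Kullback–Leibler divergence recovered in the limit α → 1. A minimal sketch (the discrete setting and the function name are illustrative assumptions, not taken from the paper):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) for discrete distributions
    given as sequences of probabilities (assumes q_i > 0 wherever p_i > 0)."""
    if alpha == 1.0:
        # The limit alpha -> 1 recovers the Kullback-Leibler divergence.
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)
```

As expected, the divergence vanishes when P = Q and is positive otherwise, e.g. `renyi_divergence([0.5, 0.5], [0.5, 0.5], 2.0)` gives 0, while `renyi_divergence([0.9, 0.1], [0.5, 0.5], 2.0)` is strictly positive.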


Full work available at URL: https://arxiv.org/abs/0805.0129


Cited In (20)





