Why the Shannon and Hartley entropies are ‘natural’
DOI: 10.2307/1426210
zbMath: 0277.94011
OpenAlex: W2327198916
MaRDI QID: Q4404131
János Aczél, Bruno Forte, Che Tat Ng
Publication date: 1974
Published in: Advances in Applied Probability
Full work available at URL: https://doi.org/10.2307/1426210
MSC classifications: Inequalities for sums, series and integrals (26D15); Convexity of real functions in one variable, generalizations (26A51); Information theory (general) (94A15); Functional equations and inequalities (39B99)
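For reference, the two entropies named in the title are, in conventional notation (base-2 logarithms and the symbols H_n and {}^0H_n are ours, not part of this record):

    H_n(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i                 (Shannon)
    {}^0H_n(p_1, \dots, p_n) = \log_2 \lvert \{\, i : p_i > 0 \,\} \rvert   (Hartley)

The paper's main theorem characterizes the information measures satisfying a small set of natural conditions (including symmetry, expansibility, additivity, and subadditivity) as exactly the nonnegative linear combinations a\,H_n + b\,{}^0H_n with a, b \ge 0.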
Related Items:
- Measures of Uncertainty and Information Based on Possibility Distributions
- Where do we stand on measures of uncertainty, ambiguity, fuzziness, and the like?
- Uniqueness of information measure in the theory of evidence
- A Medida de Informação de Shannon: Entropia [The Shannon Measure of Information: Entropy]
- The fundamental equation of information and its generalizations
- Characterizations of a discrete normal distribution
- A characterization of the Segal entropy
- Equivalence of partition functions leads to classification of entropies and means
- On some generalized functional equations in information theory
- Tribute to a distinguished Professor János Aczél at 85
- Informational separability and entropy
- Entropic mobility index as a measure of (in)equality of opportunity
- Characterizing entropy in statistical physics and in quantum information theory
- The many facets of entropy
- A Comparative Assessment of Various Measures of Entropy
- Measures of uncertainty for imprecise probabilities: an axiomatic approach
- A unique characterization of the generalized Boltzmann-Gibbs-Shannon entropy
- Revisiting prior distributions. II: Implications of the physical prior in maximum entropy analysis
- Lectures on Entropy. I: Information-Theoretic Notions
- Representing preorders with injective monotones
- Measuring information beyond communication theory - why some generalized information measures may be useful, others not