Measuring information beyond communication theory - why some generalized information measures may be useful, others not
Publication: 799647
DOI: 10.1007/BF02192655 · zbMath: 0547.94004 · OpenAlex: W2076054002 · MaRDI QID: Q799647
Publication date: 1984
Published in: Aequationes Mathematicae
Full work available at URL: https://eudml.org/doc/137008
Mathematics Subject Classification:
- Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
- Functional equations for functions with more general domains and/or ranges (39B52)
- Measures of information, entropy (94A17)
- Functional equations and inequalities (39B99)
Related Items
- Extension of some results for channel capacity using a generalized information measure
- Characterizations of sum form information measures on open domains
- A Medida de Informação de Shannon: Entropia (The Shannon information measure: entropy)
- The fundamental equation of information and its generalizations
- Certainty equivalents and information measures: Duality and extremal principles
- Relative and Discrete Utility Maximising Entropy
- Uncertainty measures, decomposability and admissibility
Cites Work
- A mixed theory of information. VIII: Inset measures depending upon several distributions
- A mixed theory of information. IV: Inset-inaccuracy and directed divergence
- A mixed theory of information. V: How to keep the (inset) expert honest
- On the inequality \(\sum_{i=1}^n p_i \frac{f(p_i)}{f(q_i)} \geq 1\)
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- On measures of information and their characterizations
- On the inequality \(\sum p_if(p_i) \geq \sum p_if(q_i)\).
- The role of boundedness in characterizing Shannon entropy
- Entropies of degree β and lower bounds for the average error rate
- A mixed theory of information. III. Inset entropies of degree β
- Why the Shannon and Hartley entropies are ‘natural’
- Representation for measures of information with the branching property
- Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen (On means and entropies of complete probability distributions)
- A coding theorem and Rényi's entropy
- On the Foundations of Information Theory
- Definition of entropy by means of a coding problem
- Generalized information functions
- On the measurable solutions of a functional equation
- On the Inequality