Measuring information beyond communication theory - why some generalized information measures may be useful, others not
DOI: 10.1007/BF02192655 · zbMATH Open: 0547.94004 · OpenAlex: W2076054002 · MaRDI QID: Q799647 · FDO: Q799647
Authors: János Aczél
Publication date: 1984
Published in: Aequationes Mathematicae
Full work available at URL: https://eudml.org/doc/137008
Recommendations
- Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations
- scientific article; zbMATH DE number 3943704
- scientific article; zbMATH DE number 3869206
- scientific article; zbMATH DE number 3623432
MSC Classification
- Functional equations and inequalities (39B99)
- Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
- Measures of information, entropy (94A17)
- Functional equations for functions with more general domains and/or ranges (39B52)
Cites Work
- Title not available
- Generalized information functions
- On measures of information and their characterizations
- On the Foundations of Information Theory
- A coding theorem and Rényi's entropy
- Why the Shannon and Hartley entropies are ‘natural’
- Definition of entropy by means of a coding problem
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- Title not available
- On the measurable solutions of a functional equation
- On the inequality \(\sum p_if(p_i) \geq \sum p_if(q_i)\).
- Title not available
- A mixed theory of information. VIII: Inset measures depending upon several distributions
- A mixed theory of information. IV: Inset-inaccuracy and directed divergence
- A mixed theory of information. V: How to keep the (inset) expert honest
- On the inequality \(\sum_{i=1}^n p_i\frac{f(p_i)}{f(q_i)}\geq 1\)
- Title not available
- Title not available
- Title not available
- Title not available
- The role of boundedness in characterizing Shannon entropy
- Title not available
- Entropies of degree β and lower bounds for the average error rate
- Title not available
- Title not available
- A mixed theory of information. III. Inset entropies of degree β
- Representation for measures of information with the branching property
- Title not available
- Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen
- Title not available
- Title not available
- On the Inequality
Cited In (12)
- Title not available
- Certainty equivalents and information measures: Duality and extremal principles
- Characterizations of sum form information measures on open domains
- Extension of some results for channel capacity using a generalized information measure
- Relative and discrete utility maximising entropy
- In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory
- Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations
- Title not available
- The fundamental equation of information and its generalizations
- Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations
- A Medida de Informação de Shannon: Entropia
- Uncertainty measures, decomposability and admissibility