Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations (Q5895463)
From MaRDI portal
scientific article; zbMATH DE number 3884086
| Language | Label | Description | Also known as |
|---|---|---|---|
| default for all languages | No label defined | | |
| English | Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations | scientific article; zbMATH DE number 3884086 | |
Statements
Measuring information beyond communication theory. Some probably useful and some almost certainly useless generalizations (English)
1984
[Correction of the incomplete review Zbl 0546.94004.] The author surveys recent results in which purely probabilistic information measures (such as the Shannon entropy, the Rényi entropy, the entropies of higher degree, the directed divergence, and the information improvement) are characterized by naturally arising properties. He also reports on results and applications of a new mixed theory of information, in which the measures of information may depend on both the probabilities and the events. Finally, the author discusses the "usefulness" of the information measures considered and offers many suggestions for further research in this area of information theory.
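The purely probabilistic measures named in the review can be illustrated in a few lines. A minimal sketch follows, assuming finite probability distributions given as lists; the function names and the choice of base-2 logarithms are mine, not the article's:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), skipping zero terms."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha: log2(sum_i p_i^alpha) / (1 - alpha).

    Defined for alpha > 0, alpha != 1; it tends to the Shannon entropy
    as alpha -> 1.
    """
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def directed_divergence(p, q):
    """Directed (Kullback-Leibler) divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (hypothetical example data).
p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]

print(shannon_entropy(p))        # 1.5 bits
print(renyi_entropy(p, 2.0))     # collision entropy of p
print(directed_divergence(p, q)) # non-negative; 0 iff p == q
```

The "mixed theory of information" mentioned in the review goes beyond such measures by letting the information depend on the events themselves, not only on their probabilities; that extension is not captured by this sketch.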
probabilistic information measures
mixed theory of information