An alternative to entropy in the measurement of information (Q1769764)
From MaRDI portal
scientific article
Statements
An alternative to entropy in the measurement of information (English)
22 March 2005
Summary: Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in Shannon's work more than fifty years ago. There have been attempts to find a more general measure of information, but their outcomes were mainly of formal, theoretical interest and did not provide better insight into the nature of information. The strengths of entropy seemed so obvious that little effort was made to find an alternative that gives different values but is consistent with entropy, in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated through a historical review of the problems with the conceptualization of information. An alternative measure is then presented in the context of a modified definition of information, applicable outside the conduit metaphor of Shannon's approach and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
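The two shortcomings of differential entropy named in the summary can be checked directly. A minimal sketch (the uniform distribution and natural logarithm are illustrative choices, not taken from the article):

```python
import math

# Differential entropy of Uniform(0, w) is log(w); this closed form makes
# both defects mentioned in the summary visible with no integration needed.
def uniform_diff_entropy(w: float) -> float:
    return math.log(w)

# 1) Not always positive: a support narrower than 1 gives negative entropy.
h_narrow = uniform_diff_entropy(0.5)      # log(0.5) < 0

# 2) Not invariant under linear coordinate changes: scaling x -> a*x turns
#    Uniform(0, w) into Uniform(0, a*w), shifting the entropy by log|a|.
a = 2.0
h_scaled = uniform_diff_entropy(a * 0.5)  # equals h_narrow + log(a)
```

Any measure meeting the article's stated criteria would, by contrast, remain positive and unchanged under such rescalings.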
Keywords:
entropy
measures of information
information theory
semantics of information