A mixed theory of information. X: Information functions and information measures (Q1094387)

From MaRDI portal
scientific article

    1987
    [For Part IX by \textit{B. Ebanks} and \textit{W. Sander} see Util. Math. 30, 63-78 (1986; Zbl 0576.94010).] In the new, mixed theory of information, the measures of information are assumed to depend on both the probabilities and the events. The author characterizes measures depending upon an n-tuple of events and upon a finite number of n-ary complete discrete probability distributions by a recursivity condition and a weak symmetry condition. The result contains as special cases many characterization theorems for well-known information measures such as the Shannon entropy, the entropy of degree \(\alpha\), the inaccuracy, the directed divergence and the information improvement. A generalization of this result was found by \textit{B. Ebanks}, \textit{P. L. Kannappan} and \textit{C. T. Ng} [Recursive inset entropies of multiplicative type on open domains; Manuscript]. For a survey of all important results about weakly symmetric and generalized recursive information measures we refer to the survey article by the present author [The fundamental equation of information and its generalizations, Aequationes Math. 33, 150-182 (1987)].
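    The classical measures named in the review have standard closed forms. A minimal sketch in Python of the purely probabilistic versions (the function names and the base-2 convention are illustrative choices, not taken from the paper; the paper's inset measures additionally depend on the events themselves, which is not modelled here):

    ```python
    import math

    def shannon_entropy(p):
        """Shannon entropy H(P) = -sum_i p_i * log2(p_i), in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def entropy_of_degree_alpha(p, alpha):
        """Entropy of degree alpha (Havrda-Charvat/Daroczy normalization):
        H_alpha(P) = (sum_i p_i^alpha - 1) / (2^(1-alpha) - 1), alpha != 1.
        It tends to the Shannon entropy as alpha -> 1."""
        assert alpha != 1
        return (sum(pi ** alpha for pi in p) - 1) / (2 ** (1 - alpha) - 1)

    def inaccuracy(p, q):
        """Kerridge inaccuracy K(P, Q) = -sum_i p_i * log2(q_i)."""
        return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

    def directed_divergence(p, q):
        """Directed (Kullback-Leibler) divergence
        D(P || Q) = sum_i p_i * log2(p_i / q_i)."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    ```

    For the Shannon entropy, the recursivity mentioned in the review takes the familiar branching form \(H(p_1, p_2, p_3, \dots) = H(p_1 + p_2, p_3, \dots) + (p_1 + p_2)\, H\!\left(\frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2}\right)\).
    
    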
    Keywords: inset measures; mixed theory of information; measures of information; recursivity; symmetry; Shannon entropy; entropy of degree \(\alpha\); inaccuracy; directed divergence; information improvement
