Axiomatic characterizations of information measures (Q845363)

From MaRDI portal
scientific article

    Statements

    Axiomatic characterizations of information measures (English)
    29 January 2010
    Summary: Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) characterization of functions of probability distributions suitable as information measures; (B) characterization of set functions on the subsets of \(\{1,\dots,N\}\) representable by joint entropies of components of an \(N\)-dimensional random vector; (C) axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
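The two central quantities named in the summary can be sketched numerically. The snippet below is an illustrative aside, not part of the surveyed paper; the function names and the example distributions are chosen here for clarity. It also checks the standard identity \(D(p\|u) = \log N - H(p)\) for the uniform distribution \(u\) on \(N\) outcomes, which links the two measures.

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, in nats, with the convention 0 log 0 = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def i_divergence(p, q):
    # Kullback I-divergence D(p || q) = sum_i p_i log(p_i / q_i)
    # (assumes p is absolutely continuous w.r.t. q on its support)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
p = [0.5, 0.25, 0.125, 0.125]

# Entropy is maximized by the uniform distribution: H(u) = log 4
print(shannon_entropy(uniform))
# D(p || u) = log 4 - H(p), and is always nonnegative
print(i_divergence(p, uniform))
```

The identity printed last is the discrete form of the MaxEnt principle in direction (C): maximizing entropy under constraints is equivalent to minimizing I-divergence from the uniform distribution.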
    Shannon entropy
    Kullback I-divergence
    Rényi information measures
    \(f\)-divergence
    \(f\)-entropy
    functional equation
    proper score
    maximum entropy
    transitive inference rule
    Bregman distance
