Entropic nonextensivity: A possible measure of complexity (Q5954860)

scientific article; zbMATH DE number 1702123

    Statements

    Entropic nonextensivity: A possible measure of complexity (English)
    5 June 2003
    The notion of entropy originated in thermodynamics, where it was introduced to distinguish between reversible and irreversible processes. Boltzmann's statistical physics made it possible to describe entropy in probabilistic terms, as \(-\sum p_i\cdot\log(p_i)\), where \(p_i\) are the probabilities of the different micro-states \(i\). Since Boltzmann, this and related definitions have been thoroughly confirmed by experiment and have led to many useful physical applications. Since the 1940s, the notion of entropy has also been used in information theory and data processing; in particular, the well-known maximum entropy approach enables us to reconstruct an unknown distribution from the results of incomplete measurements, as the distribution with the largest entropy among all distributions that are consistent with the given measurement results. This approach has been applied to areas ranging from image processing to physics proper. In some cases, even better results can be obtained if, instead of the traditional entropy function, we maximize a ``generalized'' entropy \(S_q=-\sum p_i^q\). When \(q=1+\varepsilon\) and \(\varepsilon\to 0\), then, since \(p_i^{1+\varepsilon}=p_i\cdot e^{\varepsilon\cdot\log(p_i)}\approx p_i\cdot(1+\varepsilon\cdot\log(p_i))\) and \(\sum p_i=1\), we have \(S_q\approx -1-\varepsilon\cdot\sum p_i\cdot\log(p_i)\), so maximizing \(S_q\) yields results close to those of maximizing the traditional entropy. The author has discovered several physical situations in which maximizing the generalized entropy describes the actual probabilities better than maximizing the classical entropy. The author's conclusion is that for different systems, physical entropy is indeed described by formulas with different \(q\): the more complex the system, the larger \(q\). Thus, the difference \(q-1\) can serve as a measure of the system's complexity. The author also extends several formulas from traditional statistical physics to the case of the generalized entropy.
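    As an illustration of the maximum entropy approach mentioned above (a minimal sketch under an assumed setting, not taken from the paper): suppose the only measurement result is the mean of a quantity with possible values 1, 2, 3, 4. Among all consistent distributions, the classical-entropy maximizer is known (via Lagrange multipliers) to have the Gibbs form \(p_i\propto e^{-\lambda x_i}\), and \(\lambda\) can be found by bisection. All names (gibbs, solve_lam, the sample target mean 1.7) are illustrative; plain Python, no external libraries.

    import math

    def gibbs(lam, xs):
        """Candidate maximizer: p_i proportional to exp(-lam * x_i)."""
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return [wi / z for wi in w]

    def mean(p, xs):
        return sum(pi * x for pi, x in zip(p, xs))

    def solve_lam(xs, target, lo=-50.0, hi=50.0):
        """Bisection on lam: mean(gibbs(lam)) decreases monotonically in lam."""
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if mean(gibbs(mid, xs), xs) > target:
                lo = mid   # mean still too high, so a larger lam is needed
            else:
                hi = mid
        return 0.5 * (lo + hi)

    xs = [1, 2, 3, 4]                # possible outcomes of the measured quantity
    lam = solve_lam(xs, target=1.7)  # only the mean was measured
    p = gibbs(lam, xs)
    print("max-entropy reconstruction:", [round(pi, 4) for pi in p])
    print("mean check:", round(mean(p, xs), 4))   # should be close to 1.7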
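    The \(q\to 1\) limit can likewise be checked numerically. The following sketch (again illustrative, not from the paper) evaluates the review's simplified generalized entropy \(S_q=-\sum p_i^q\) for a sample distribution and compares it with the first-order approximation \(-1-\varepsilon\cdot\sum p_i\cdot\log(p_i)\); note that for \(q>1\) this simplified form differs from Tsallis's standard \((1-\sum p_i^q)/(q-1)\) only by a positive affine rescaling, so both have the same maximizers.

    import math

    def shannon_entropy(p):
        """Classical entropy -sum p_i * log(p_i), natural log."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    def generalized_entropy(p, q):
        """Simplified generalized entropy S_q = -sum p_i^q, as in the review."""
        return -sum(pi ** q for pi in p)

    p = [0.5, 0.3, 0.2]            # a sample probability distribution
    H = shannon_entropy(p)

    for eps in (0.1, 0.01, 0.001):
        q = 1.0 + eps
        exact = generalized_entropy(p, q)
        # -1 - eps * sum p_i log p_i equals -1 + eps * H, since sum p_i log p_i = -H
        approx = -1.0 + eps * H
        print(f"eps={eps}: S_q = {exact:.6f}, first-order approx = {approx:.6f}")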
    generalized entropy
    statistical physics
    information theory
    data processing
    incomplete measurements