Entropic nonextensivity: A possible measure of complexity (Q5954860)

From MaRDI portal
Property / cites work: The thermodynamics of phase equilibrium / rank: Normal rank
Property / cites work: Q4002244 / rank: Normal rank
Property / cites work: Q5765719 / rank: Normal rank
Property / cites work: Possible generalization of Boltzmann-Gibbs statistics. / rank: Normal rank
Property / cites work: Q4392284 / rank: Normal rank
Property / cites work: Generalization of Shannon’s theorem for Tsallis entropy / rank: Normal rank
Property / cites work: Axioms and uniqueness theorem for Tsallis entropy / rank: Normal rank
Property / cites work: Q5812325 / rank: Normal rank
Property / cites work: Q5606305 / rank: Normal rank
Property / cites work: Q4186205 / rank: Normal rank
Property / cites work: Q5831515 / rank: Normal rank
Property / cites work: Q5564905 / rank: Normal rank
Property / cites work: Q5589008 / rank: Normal rank
Property / cites work: Generalized information functions / rank: Normal rank
Property / cites work: Q4002820 / rank: Normal rank
Property / cites work: Dynamic Linear Response Theory for a Nonextensive System Based on the Tsallis Prescription / rank: Normal rank
Property / cites work: A statistical mechanical approach to generalized statistics of quantum and classical gases / rank: Normal rank
Property / cites work: Nonuniqueness of canonical ensemble theory arising from microcanonical basis / rank: Normal rank
Property / cites work: A quantitative test of Gibbs' statistical mechanics / rank: Normal rank
Property / cites work: Non-equilibrium thermodynamics and anomalous diffusion / rank: Normal rank
Property / cites work: Itō-Langevin equations within generalized thermostatistics / rank: Normal rank
Property / cites work: Impacts of noise on a field theoretical model of the human brain. / rank: Normal rank
Property / cites work: Q3836832 / rank: Normal rank
Property / cites work: Power-law sensitivity to initial conditions---new entropic representation / rank: Normal rank
Property / cites work: The rate of entropy increase at the edge of chaos / rank: Normal rank
Property / cites work: An Introduction to Chaos in Nonequilibrium Statistical Mechanics / rank: Normal rank
Property / cites work: Classical spin systems with long-range interactions: universal reduction of mixing / rank: Normal rank
Property / cites work: Dynamical quasi-stationary states in a system with long-range forces / rank: Normal rank
Property / cites work: Maximum entropy versus minimum enstrophy vortices / rank: Normal rank
Property / cites work: Tsallis statistics and turbulence / rank: Normal rank
Property / cites work: Optimality, entropy and complexity for nonextensive quantum scattering / rank: Normal rank
Property / cites work: Non-extensive statistics and solar neutrinos / rank: Normal rank
Property / cites work: Classical and quantum non-extensive statistics effects in nuclear many-body problems / rank: Normal rank
Property / cites work: Convergence of simulated annealing using the generalized transition probability / rank: Normal rank
Property / cites work: A simple approach to time-inhomogeneous dynamics and applications to (fast) simulated annealing / rank: Normal rank
Property / cites work: The renormalization group and optimization of entropy / rank: Normal rank
Property / cites work: Scale-invariant random-walks and optimization of non-extensive entropy / rank: Normal rank
Property / cites work: Nonadditive conditional entropy and its significance for local realism / rank: Normal rank


Language: English
Label: Entropic nonextensivity: A possible measure of complexity
Description: scientific article; zbMATH DE number 1702123

    Statements

    Entropic nonextensivity: A possible measure of complexity (English)
    5 June 2003
    The notion of entropy originated in thermodynamics, to distinguish between reversible and irreversible processes. Boltzmann's statistical physics made it possible to describe entropy in probabilistic terms, as \(-\sum p_i\cdot\log(p_i)\), where \(p_i\) are the probabilities of the different micro-states \(i\). Since Boltzmann, this definition (and related ones) has been thoroughly confirmed experimentally and has led to many useful physical applications. Since the 1940s, the notion of entropy has also been used in information theory and data processing; in particular, the well-known maximum entropy approach enables us to reconstruct an unknown distribution from the results of incomplete measurements, as the distribution with the largest entropy among all distributions consistent with the given measurement results. This approach has been applied to areas ranging from image processing to physics proper. In some cases, even better results can be obtained if, instead of the traditional entropy function, we maximize a ``generalized'' entropy \(S_q=-\sum p_i^q\). When \(q=1+\varepsilon\) and \(\varepsilon\to 0\), then, due to \(\sum p_i=1\), we have \(S_q\approx -1-\varepsilon\cdot \sum p_i\cdot \log(p_i)\), so the results of maximizing \(S_q\) are close to those of maximizing the traditional entropy. The author has discovered several physical situations in which maximizing the generalized entropy describes the actual probabilities better than maximizing the classical entropy does. The author's conclusion is that for different systems, the physical entropy is indeed described by formulas with different \(q\): the more complex the system, the larger \(q\). Thus, the difference \(q-1\) can serve as a measure of the system's complexity. The author also extends several formulas from traditional statistical physics to the case of generalized entropy.
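To make the limiting behaviour quoted above concrete, here is a minimal numerical sketch (not part of the original review). It assumes the standard Tsallis form \(S_q=(1-\sum_i p_i^q)/(q-1)\), which for fixed \(q>1\) is an affine rescaling of the reviewer's shorthand \(-\sum_i p_i^q\) and therefore has the same maximizers; the probability vector and function names are purely illustrative.

```python
import numpy as np

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy S_1 = -sum_i p_i log p_i (with k = 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), defined for q != 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Illustrative distribution; any normalized p_i would do.
p = np.array([0.5, 0.25, 0.125, 0.125])
s1 = shannon_entropy(p)
print(f"Shannon entropy S_1: {s1:.6f}")

for eps in (1e-1, 1e-2, 1e-3):
    q = 1.0 + eps
    sq = tsallis_entropy(p, q)
    # Reviewer's shorthand: -sum p_i^q ~ -1 - eps * sum p_i log p_i = -1 + eps * S_1
    shorthand = -np.sum(p ** q)
    print(f"q = {q:<6}  S_q = {sq:.6f}   -sum p_i^q = {shorthand:.6f}"
          f"   -1 + eps*S_1 = {-1.0 + eps * s1:.6f}")
```

Running this shows that, as \(\varepsilon=q-1\to 0\), \(S_q\) approaches the Shannon value and \(-\sum_i p_i^q\) approaches \(-1+\varepsilon S_1\), matching the expansion quoted in the review.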
    generalized entropy
    statistical physics
    information theory
    data processing
    incomplete measurements

    Identifiers

    zbMATH DE number 1702123