Is the entropy \(S_q\) extensive or nonextensive? (Q863250)

From MaRDI portal
scientific article

    Statements

    Is the entropy \(S_q\) extensive or nonextensive? (English)
    25 January 2007
    The author discusses the extensivity of the entropies \(S_{BG}\) and \(S_q\) according to their definitions, and concludes that the entropy \(S_q\) is nonextensive and sometimes differs from the entropy \(S_{BG}\). A key point of this issue is the definition of the entropy \(S_q\), which results from the nonextensive statistical mechanics developed by the author and his collaborators. We wonder why the author defines the entropy \(S_q\) in a way that clearly differs from the definition of the entropy \(S_{BG}\). It is well known that when Clausius introduced the concept of entropy, he regarded it as a function of the state of a system, so that the entropy is an extensive variable in thermodynamics. It is not meaningful to define a so-called entropy apart from the basic notion that an entropy is a function of a state. The additivity of entropies is associated simply with the additivity of states; a product or a sum of probabilities has no direct relation to the additivity of states. The situation is the same as for energies: there exists only an energy of two interacting states, not a product of the energies of these two states. Likewise there should be no product of entropies, since such a product conflicts with the essential meaning of the definition of entropy and has no physical meaning.

    According to statistical mechanics, in a mixed ensemble the distribution function of a system is described by a density matrix whose elements \(\omega_{nn}\) are diagonal for all subsystems. Since this statement involves neglecting the interactions between subsystems, it is more precise to say that the non-diagonal elements \(\omega_{mn}\) tend to zero as the relative importance of these interactions decreases, and hence as the number of particles in the subsystems increases. Thus the mean value of any quantity becomes simply \[ \overline f=\sum_n\omega_n f_{nn},\tag{1} \] where \(\omega_n=\omega_{nn}\) and only the diagonal matrix elements \(f_{nn}\) enter. By the statistical independence of the subsystems, the logarithm of the distribution function of a subsystem must be of the form \[ \ln\omega_n^{(a)}=\alpha^{(a)}+\beta E_n^{(a)},\tag{2} \] where the index \(a\) labels the \(a\)-th subsystem, and \(\alpha^{(a)}\) and \(\beta\) are constants. Thus the probabilities \(\omega_n\) can be expressed as a function of the energy level alone, \(\omega_n=\omega(E_n)\), and (2) can be rewritten generally as \[ \ln\omega(E_n)=\alpha+\beta E_n.\tag{3} \] Since this expression is linear in \(E_n\), taking mean values gives \[ \ln\omega(\overline E)=\alpha+\beta\overline E=\overline{\ln\omega(E_n)}.\tag{4} \] The entropy can therefore be written as \[ S=-\overline{\ln\omega(E_n)},\tag{5} \] i.e., the entropy can be defined as minus the mean logarithm of the distribution function of the system. By the definition of the mean value, we have \[ S=-\sum_n\omega_n\ln\omega_n.\tag{6} \] In quantum mechanics this expression can be written in a general operator form, independent of the choice of the set of wave functions with respect to which the statistical matrix elements are defined: \[ S=-\operatorname{Tr}(\widehat\omega\ln\widehat\omega),\tag{7} \] where \(\ln\widehat\omega\) must be understood as the operator whose eigenvalues equal the logarithms of the eigenvalues of \(\widehat\omega\) and whose eigenfunctions are the same as those of \(\widehat\omega\).
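    The additivity carried by the definition (6) can be spelled out explicitly (a standard computation, not written out above): for two statistically independent subsystems A and B the joint distribution factorizes, \(\omega_{nm}^{(AB)}=\omega_n^{(A)}\omega_m^{(B)}\), and substituting this into (6) gives \[ S_{AB}=-\sum_{n,m}\omega_n^{(A)}\omega_m^{(B)}\ln\bigl(\omega_n^{(A)}\omega_m^{(B)}\bigr) =-\sum_n\omega_n^{(A)}\ln\omega_n^{(A)}-\sum_m\omega_m^{(B)}\ln\omega_m^{(B)}=S_A+S_B, \] using the normalizations \(\sum_n\omega_n^{(A)}=\sum_m\omega_m^{(B)}=1\). The additivity of (6) for independent subsystems is thus a direct consequence of the logarithm turning a product of probabilities into a sum.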
    Even if a system is in a nonequilibrium state, the system entropy is still related to its stability and to all the local equilibrium states in which its subsystems exist, and the system entropy simply equals the sum of the entropies of those subsystems. Information theory subsequently developed beyond the domain of communication theory, with significant contributions to statistical mechanics, computer science, probability theory and statistics. The entropy, the central concept of this theory, was carried over from thermodynamics; in each setting, the entropy is a measure of randomness or unpredictability. Consider a source emitting a string \(S\) of symbols \(s_i\) drawn from an alphabet \(A=\{1,2,\dots,b\}\) with given probabilities \(P(s)\), \(\forall s\in A\). If \(P(s)=1\) for some \(s\), the outcome of the experiment is univocal and the source conveys no information. If, on the contrary, \(P(s)=\frac1b\) for all \(s\), the uncertainty is maximal (provided that the source has no memory). The average information over all outcomes \(s_i\in A\) of an experiment \(\Lambda\) is conveniently measured by the entropy \[ H(\Lambda)=-\sum_{i=1}^b P(s_i)\ln P(s_i),\tag{8} \] where \(P=\{P(1),\dots,P(b)\}\) is the probability distribution for \(\Lambda\).

    From (5), (7) and (8) above we see that the entropy involves not only probabilities but also mean values, so that the entropy exhibits additivity; in this sense the definition of the entropy is reasonable. If, in a composite system that can be considered as a mixed ensemble, there exist new states which do not belong independently to the original subsystems A or B, that is, if there are non-diagonal elements in the density matrix, then we may divide the composite system anew into independent subsystems, each of which represents only one state; in other words, we transform the density matrix so as to eliminate all non-diagonal elements. In general, each new subsystem is not equivalent to the original subsystems A or B, and the additivity of these new states corresponds to the additivity of the entropies of these new subsystems. In short, a definition of entropy in terms of probabilities must be consistent with the additivity of entropies; otherwise such a definition has no physical significance, even though one can define a so-called entropy at will.
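    For comparison, recall the standard definition of the entropy in question (in units \(k=1\)), \(S_q=\bigl(1-\sum_i p_i^q\bigr)/(q-1)\), with \(S_{BG}\) recovered in the limit \(q\to1\). For two independent subsystems one has \(\sum_{i,j}\bigl(p_i^{(A)}p_j^{(B)}\bigr)^q=\bigl(\sum_i(p_i^{(A)})^q\bigr)\bigl(\sum_j(p_j^{(B)})^q\bigr)\), and a short calculation then gives the pseudo-additivity rule \[ S_q(A,B)=S_q(A)+S_q(B)+(1-q)\,S_q(A)\,S_q(B), \] so that for \(q\neq1\) the entropy \(S_q\) of two independent subsystems is not the sum of their entropies. This is precisely the nonadditivity at issue in the discussion above.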
    third order
    nonextensive statistical mechanics
    extensivity of entropy

    Identifiers
