Information entropy in problems of classical and quantum statistical mechanics (Q610536)

From MaRDI portal
scientific article

    Statements

    8 December 2010
    This short paper considers a generalization of information entropy with respect to a set \(F\) of observables (the \(F\)-entropy). Its special cases include the structurally stable entropy [cf. e.g. \textit{H. Poincaré}, Selected works in three volumes. Volume III: Mathematics. Theoretical physics. Analysis of the works of Henri Poincaré on mathematics and the natural sciences. Moskva: Izdat. Nauka. 771 p. R. 4.00 (1974; Zbl 0526.01033)] and the entropy of a macroscopic state (used by von Neumann). Since it is generally accepted that the information entropy of a Gibbs state of both quantum and classical systems coincides with the thermodynamic entropy of an equilibrium state, the \(F\)-entropy of a time-dependent state of the system is defined as the information entropy of an ensemble similar to the microcanonical state. The \(F\)-entropy, however, is defined for any state of the system, which need not be a quasi-equilibrium one. The main purpose of the paper is to study conditions under which the \(F\)-entropy of a quantum system stabilizes as time increases. The paper consists of four sections. The first gives the definitions and preliminaries for both the quantum and the classical case. It starts from the assumption that each state of the system is determined by a Borel probability measure on the separable complex Hilbert space of the system. If \(\nu\) is a state of the quantum system and \(A\) is a bounded quantum observable, then \(\int(Ax,x)\,\nu(dx)=\operatorname{tr}AT\), where \(T\) is the correlation operator of the measure \(\nu\), defined as \(T=\int(x\otimes x)\,\nu(dx)\). Analogously, if \(\nu\) and \(f\) are a state and an observable of a classical Hamiltonian system, then the mean value of the observable is \(\mu=\int f(x)\,\nu(dx)\).
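The identity \(\int(Ax,x)\,\nu(dx)=\operatorname{tr}AT\) can be checked numerically for a discrete measure on a finite-dimensional space. The sketch below (the random measure, the observable, and all variable names are illustrative, not taken from the paper) builds a discrete probability measure \(\nu\) with finitely many atoms, forms the correlation operator \(T=\sum_k p_k\,x_k\otimes x_k\), and compares both sides:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3  # dimension of the (here finite-dimensional) Hilbert space
n = 5  # number of atoms of the discrete measure nu

# A discrete probability measure nu on C^d: atoms x_k with weights p_k
xs = rng.normal(size=(n, d)) + 1j * rng.normal(size=(n, d))
p = rng.random(n)
p /= p.sum()

# A bounded (Hermitian) quantum observable A
B = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
A = (B + B.conj().T) / 2

# Correlation operator T = integral of (x ⊗ x) nu(dx) = sum_k p_k x_k x_k^*
T = sum(pk * np.outer(xk, xk.conj()) for pk, xk in zip(p, xs))

# Left side: integral of (Ax, x) nu(dx) = sum_k p_k <x_k, A x_k>
lhs = sum(pk * np.vdot(xk, A @ xk) for pk, xk in zip(p, xs))

# Right side: tr(AT)
rhs = np.trace(A @ T)
print(np.allclose(lhs, rhs))  # True: the identity holds atom by atom
```

The check works because \(\operatorname{tr}(A\,x\otimes x)=(Ax,x)\) for each atom, so the identity reduces to linearity of the trace over the weights \(p_k\).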
If \(\nu\) and \(\mu\) are measures on a measurable space \(\Omega\) with \(\nu(\Omega)=1\) and \(\nu\) absolutely continuous with respect to \(\mu\), the information entropy is defined as \(S(\nu,\mu)=-\int \frac{d\nu}{d\mu}\ln\frac{d\nu}{d\mu}\,d\mu\). When \(\mu\) is the counting measure, one writes \(S(\nu)=S(\nu,\mu)\). The second section gives the definition of the \(F\)-entropy (denoted by \(S_F\)), the third presents possible applications of the notions and assumptions introduced in the previous sections, and the last section gives the proofs of two results stated in the third section. Reviewer's remark: Since equations are not numbered in this paper, it is sometimes hard to point to the exact equation referred to in the text.
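For discrete measures the definition of \(S(\nu,\mu)\) is easy to compute directly. The following sketch (the function name and example distribution are illustrative, not from the paper) evaluates \(S(\nu,\mu)=-\int \frac{d\nu}{d\mu}\ln\frac{d\nu}{d\mu}\,d\mu\) on a finite set, where the Radon-Nikodym derivative is the pointwise ratio of weights; with \(\mu\) the counting measure it reduces to the familiar Shannon entropy \(-\sum_i p_i\ln p_i\):

```python
import numpy as np

def information_entropy(nu, mu):
    """S(nu, mu) = -sum_i (nu_i/mu_i) ln(nu_i/mu_i) mu_i for measures
    on a finite set, with the convention 0 * ln 0 = 0."""
    nu = np.asarray(nu, dtype=float)
    mu = np.asarray(mu, dtype=float)
    r = nu / mu               # Radon-Nikodym derivative on the atoms
    mask = nu > 0             # skip atoms where nu vanishes (0 ln 0 = 0)
    return -np.sum(r[mask] * np.log(r[mask]) * mu[mask])

# With mu the counting measure (mu_i = 1), S(nu) is the Shannon entropy
p = np.array([0.5, 0.25, 0.25])
S = information_entropy(p, np.ones_like(p))
print(S)  # 1.5 * ln 2, approximately 1.0397
```

Note that \(S(\nu,\mu)\) can be negative or unbounded for a general reference measure \(\mu\); the counting-measure case is nonnegative because each ratio \(\nu_i/\mu_i\le 1\).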
    B-G-S information entropy
    \(F\)-entropy
    quantum systems
    statistical mechanics
    non-equilibrium systems
