Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz. (Q1187726)

From MaRDI portal
scientific article

    Statements

    Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz. (English)
    17 September 1992
    Statistical physics is widely recognized as one of the most beautiful branches of physics, owing to its physical depth, its astonishing universality and its mathematical elegance. Nevertheless, in spite of numerous efforts, its foundations remain somewhat obscure and are the subject of everlasting discussion. The central question is to what extent macroscopic irreversible behaviour is derivable from the underlying reversible dynamics.

    The book under review is a collection, in chronological order, of papers by E. T. Jaynes concerning his ``predictive'' approach to statistical mechanics. It contains two papers, reprinted from Physical Review, on the connection between statistical mechanics and information theory, the 1962 Brandeis Summer Lectures, lectures given at the ``Delaware Seminar on the Foundations of Physics'', as well as papers devoted to the entropy problem, prior probabilities and, generally speaking, the rules of statistical inference.

    One attempt to derive the irreversibility of the macroscopic world from reversible microdynamics relies on mathematical properties of the latter, such as ergodicity, metrical transitivity and mixing. This approach, although very appealing, faces severe difficulties: a proof of ergodicity has been given only for some very simple systems, the role of the size of the system in passing from infinite time averages to finite ones is unclear, etc. Moreover, the situation is complicated by the KAM theorem as well as by the existence of nontrivial, completely integrable systems.

    The author proposes a very different point of view. In his approach the role of dynamics is reduced, roughly speaking, to an ``enumeration of possibilities'', the main emphasis being placed on the peculiarities of statistical inference. He adopts the ``subjective'' interpretation of probability theory and takes the statistical ensemble to represent the state of knowledge about the physical system. The main problem is then which probability assignments to microstates correctly describe the state of knowledge we have about a given physical system. This is solved by appealing to Shannon's interpretation that the information measure of any probability distribution \(p_i\) is the entropy \(H=-\sum_i p_i\log p_i\). The author argues that, although we are dealing with a measure of our knowledge, we can still regard the entropy as an ``objective'' quantity that can be measured in the laboratory. This main idea, its consequences and related topics are discussed in great detail in the author's papers.

    The work of E. T. Jaynes contains a wealth of interesting ideas and remarks and constitutes an original approach to the foundations of statistical mechanics. However, it is the reviewer's strong belief that this is not the whole story and that the ``ergodic'' approach also contains a good deal of truth.
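
    The maximum-entropy rule mentioned above can be made concrete with the standard textbook case of a single mean-energy constraint (a sketch of the general method, not a derivation reproduced from any particular paper in the volume): assume the state of knowledge consists of normalization together with a known mean energy \(\langle E\rangle\) over microstates with energies \(E_i\). Maximizing \(H=-\sum_i p_i\log p_i\) under these constraints with Lagrange multipliers \(\lambda_0,\lambda_1\) gives
    \[
    \frac{\partial}{\partial p_i}\Bigl[-\sum_j p_j\log p_j-\lambda_0\Bigl(\sum_j p_j-1\Bigr)-\lambda_1\Bigl(\sum_j p_j E_j-\langle E\rangle\Bigr)\Bigr]=0
    \quad\Longrightarrow\quad
    p_i=\frac{e^{-\lambda_1 E_i}}{Z(\lambda_1)},\qquad Z(\lambda_1)=\sum_i e^{-\lambda_1 E_i},
    \]
    with \(\lambda_1\) fixed by the constraint through \(-\partial\log Z/\partial\lambda_1=\langle E\rangle\). In this scheme the dynamics enters only through the list of microstates and their energies \(E_i\) (the ``enumeration of possibilities''), while the probability assignment itself is fixed by the inference rule alone.
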
    Gibbs and Boltzmann entropy
    prior probabilities
    macroscopic irreversible behaviour
    reversible dynamics
    statistical mechanics
    connection between statistical mechanics and information theory
    metrical transitivity