Entropy and information. Translated by Abe Shenitzer and Robert G. Bruns. (Q1012744)


Language: English
Description: scientific article

    Statements

    Entropy and information. Translated by Abe Shenitzer and Robert G. Bruns. (English)
    Publication date: 22 April 2009
The book is structured in eight chapters covering topics ranging from the early history of thermodynamics to the complexity and value of information. It is recommended to advanced high-school students, beginning university students, and other readers with little background in physics. Writing for non-experts, the author describes all the notions using very simple mathematical equations and avoids difficult proofs; the book also contains diagrams and images to aid the reader's understanding.

The first chapter presents heat phenomena, starting from Sadi Carnot's paper ``Reflections on the motive power of fire and on machines capable of developing that power''. Heat is introduced through the concept of caloric, ``a weightless, invisible fluid that when added to a body caused its temperature to rise and was capable of changing its state''. To emphasize the dissimilarity between heat and mechanical phenomena, the author contrasts the irreversibility of heat processes with the reversibility of processes in ordinary mechanics. Thermodynamics is described through the Carnot cycle and defined as the general laws that determine the mutual connections between the physical quantities characterizing all those processes occurring in nature and technology by means of which energy is transferred from one body to another and transformed from one form to another.

Chapter 2 introduces the laws of thermodynamics -- the first law (Mayer, Joule, Helmholtz, Carnot), the second law (Carnot, Clausius), and the third law (Nernst) -- which lead to the concept to which the book is devoted: entropy. After introducing the logarithm and exponential functions, the author presents examples of how entropy is calculated and measured in practice.

The connection between entropy and energy is emphasized in the third chapter through chemical reactions, each discussion ending with conclusions about the importance of entropy. The chapter closes with Emden's idea that the greater the heat absorbed (the greater the increase in entropy), the less energy is available for doing ``useful work''.

Chapter 4 develops the connection between entropy and probability. Boltzmann's formula demonstrates that entropy increases in spontaneously evolving processes. The chapter also contains an introduction to the fusion of a crystal and the evaporation of a liquid, phenomena described further in the next chapter. The author shows how entropic forces arise and expresses statistically the fact that an isolated system in a state of equilibrium has maximum entropy. The chapter ends with an introduction to quantum mechanics, including the Gibbs paradox: ``in a continuous transition from gases differing from one another less and less to gases that are absolutely identical, there is a jump in the behaviour of the change in entropy''.

Chapter 5 continues the previous chapter by establishing the link between statistics and mechanics. The author starts with Boltzmann's law and presents some of its important consequences, among them the distribution of the molecules of a gas by energy in a gravitational field, based on the barometric formula. The second half of the chapter begins with the notion of fluctuations and their importance in everyday life; deviations are then discussed in the light of Darwin's ideas about evolution. The chapter ends with Laplace's and Sinai's ideas about the physical picture of the universe.
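For orientation, the relations around which Chapters 3--5 revolve can be stated in their standard textbook form (the notation here is the conventional one, not necessarily the book's). Emden's remark corresponds to the free-energy decomposition
\[ \Delta G = \Delta H - T\,\Delta S, \]
in which the ``bound'' part \(T\,\Delta S\) of the total energy change \(\Delta H\) is unavailable for useful work. Boltzmann's formula relates the entropy of a system to the number \(W\) of its microstates,
\[ S = k \ln W, \]
with \(k\) Boltzmann's constant, and the barometric formula gives the density of molecules of mass \(m\) at height \(h\) in a gravitational field \(g\) at absolute temperature \(T\),
\[ n(h) = n_0\, e^{-mgh/kT}. \]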
Chapter 6 introduces entropy for open systems, that is, systems that exchange matter and/or energy with the surrounding world. The author describes certain aspects of the linear thermodynamics of open systems, valid close to equilibrium, in particular the possibility of processes involving a loss of entropy as a consequence of their coupling to other processes. Processes ``far from equilibrium'' are also presented, such as the development of an embryo, where cell division and the resulting growth of the organism are directly related to an efflux of entropy into the surrounding medium.

The last two chapters present the importance of entropy in communication processes. An example is worked out to demonstrate that a linguistic text constitutes a Markov chain. The author explains why it is impossible to obtain information about an isolated adiabatic system: any instrument brought into contact with the system violates its isolation, so obtaining information about one part of an open system increases the entropy of some other part of the system. At the end, the author emphasizes the role of entropy and information in everyday life and their impact on living organisms.

The book explains all the presented formulas and phenomena in detail, and all the examples are illustrated with diagrams and images that make the notions easier to grasp.
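The Markov-chain description of language mentioned above is easy to illustrate. The following short Python sketch (not taken from the book; the sample text and all names are illustrative) estimates first-order transition probabilities between successive letters of a text, the kind of statistical model of language the communication chapters discuss.

from collections import Counter, defaultdict

def transition_probabilities(text):
    """Estimate first-order Markov transition probabilities
    between successive characters of a text."""
    counts = defaultdict(Counter)
    for current, following in zip(text, text[1:]):
        counts[current][following] += 1
    probs = {}
    for char, followers in counts.items():
        total = sum(followers.values())
        probs[char] = {nxt: n / total for nxt, n in followers.items()}
    return probs

# Illustrative sample; any longer corpus would serve better.
sample = "the theory of heat and the theory of information"
print(transition_probabilities(sample)["h"])   # {'e': 1.0}: here 'h' is always followed by 'e'

With a realistic corpus the transition matrix becomes non-trivial, and conditioning on longer contexts (second- and higher-order chains) reproduces the statistics of a language increasingly well.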
    physics
    entropy
    thermodynamics
    information

    Identifiers

    Full work available at URL: https://doi.org/10.1007/978-3-0346-0078-1
    OpenAlex ID: W4290972066