Thermodynamics and concentration (Q418227): Difference between revisions
From MaRDI portal
Created a new Item |
Changed an Item |
||
Property / review text | |||
Concentration inequalities bound the probability that random quantities deviate from their average, median, or otherwise typical value. They play an important role in the study of natural and artificial learning systems. The aim of this paper is to introduce the subadditivity of entropy as a unified basis for the derivation of concentration inequalities for functions on product spaces, and to demonstrate the benefits of formulating the concentration problem in the language of statistical thermodynamics. The author's method consists of three steps. In the first step, he expresses the log-Laplace transform in terms of an integral of the thermal entropy over a range of inverse temperatures. In the second step, he proves a tensorization inequality or, more precisely, a thermal subadditivity property of entropy: the entropy of a system is not greater than the thermal average of the sum of the entropies of its constituent subsystems. In the third step, he expresses the entropy of a subsystem in terms of thermal energy fluctuations. Finally, he presents some applications of the method. | |||
Property / review text: Concentration inequalities bound the probability that random quantities deviate from their average, median, or otherwise typical value. They play an important role in the study of natural and artificial learning systems. The aim of this paper is to introduce the subadditivity of entropy as a unified basis for the derivation of concentration inequalities for functions on product spaces, and to demonstrate the benefits of formulating the concentration problem in the language of statistical thermodynamics. The author's method consists of three steps. In the first step, he expresses the log-Laplace transform in terms of an integral of the thermal entropy over a range of inverse temperatures. In the second step, he proves a tensorization inequality or, more precisely, a thermal subadditivity property of entropy: the entropy of a system is not greater than the thermal average of the sum of the entropies of its constituent subsystems. In the third step, he expresses the entropy of a subsystem in terms of thermal energy fluctuations. Finally, he presents some applications of the method. / rank | |||
Normal rank | |||
Property / reviewed by | |||
Property / reviewed by: Jerzy August Gawinecki / rank | |||
Normal rank | |||
Property / Mathematics Subject Classification ID | |||
Property / Mathematics Subject Classification ID: 60G07 / rank | |||
Normal rank | |||
Property / Mathematics Subject Classification ID | |||
Property / Mathematics Subject Classification ID: 80A17 / rank | |||
Normal rank | |||
Property / zbMATH DE Number | |||
Property / zbMATH DE Number: 6038702 / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
entropy method | |||
Property / zbMATH Keywords: entropy method / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
tail bounds | |||
Property / zbMATH Keywords: tail bounds / rank | |||
Normal rank | |||
Property / zbMATH Keywords | |||
concentration | |||
Property / zbMATH Keywords: concentration / rank | |||
Normal rank |
Revision as of 21:05, 29 June 2023
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | Thermodynamics and concentration | scientific article | |
Statements
Thermodynamics and concentration (English)
0 references
28 May 2012
0 references
Concentration inequalities bound the probability that random quantities deviate from their average, median, or otherwise typical value. They play an important role in the study of natural and artificial learning systems. The aim of this paper is to introduce the subadditivity of entropy as a unified basis for the derivation of concentration inequalities for functions on product spaces, and to demonstrate the benefits of formulating the concentration problem in the language of statistical thermodynamics. The author's method consists of three steps. In the first step, he expresses the log-Laplace transform in terms of an integral of the thermal entropy over a range of inverse temperatures. In the second step, he proves a tensorization inequality or, more precisely, a thermal subadditivity property of entropy: the entropy of a system is not greater than the thermal average of the sum of the entropies of its constituent subsystems. In the third step, he expresses the entropy of a subsystem in terms of thermal energy fluctuations. Finally, he presents some applications of the method.
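The first step described above can be sketched with a standard identity from the entropy method; the notation below is illustrative and need not match the paper's.

```latex
% Sketch of the first step (illustrative notation, not necessarily the paper's).
% Let A(\beta) = \ln \mathbb{E}\, e^{\beta f} be the log-Laplace transform of f,
% and S(\beta) = \beta A'(\beta) - A(\beta) the associated thermal entropy.
\[
  \frac{d}{d\beta}\,\frac{A(\beta)}{\beta}
  \;=\; \frac{\beta A'(\beta) - A(\beta)}{\beta^{2}}
  \;=\; \frac{S(\beta)}{\beta^{2}} .
\]
% Since A(\beta)/\beta \to \mathbb{E} f as \beta \to 0^{+}, integrating from 0
% to \beta expresses the log-Laplace transform as an integral of the thermal
% entropy over a range of inverse temperatures:
\[
  \ln \mathbb{E}\, e^{\beta f}
  \;=\; \beta\,\mathbb{E} f \;+\; \beta \int_{0}^{\beta} \frac{S(t)}{t^{2}}\, dt .
\]
```

Bounds on the thermal entropy \(S(t)\) then translate, via this representation and the Chernoff–Markov step, into exponential tail bounds for \(f\).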
0 references
entropy method
0 references
tail bounds
0 references
concentration
0 references