The principle of maximum entropy (Q1057831)

Wikidata QID: Q56157778
MaRDI profile type: MaRDI publication profile
Cites work:
    Equivalence of Gauss's Principle and Minimum Discrimination Information Estimation of Probabilities
    Q4158954
    Q3341941
    On The Principle of Minimum interdependence
    Information Theory and Statistical Mechanics
    Q3260837
    Q3260839
    Maximum entropy interpretation of autoregressive spectral densities
    A Mathematical Theory of Communication
    Q5606390

Language: English
Label: The principle of maximum entropy
Description: scientific article

    Statements

    The principle of maximum entropy (English)
    Publication year: 1985
    The authors point out that the ''principle of maximum entropy'' can be regarded as a variational principle with applications in statistical mechanics, decision theory, pattern recognition, and time-series analysis. They explain the principle as follows: from the set of all probability distributions (for instance, over the possible microscopic states of a system) compatible with given mean values of one or several random variables (for instance, the macroscopic energy, i.e. the mean value of the energy associated with each microscopic state), choose the one that maximizes the Shannon entropy. In particular, for a discrete random variable with a given mean value, the authors relate the principle of maximum entropy to the Gibbs (canonical) distribution and to Laplace's principle of insufficient reason.
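    A minimal sketch of the variational problem described above, assuming a finite set of microscopic states with energies \(E_i\) and a prescribed mean energy \(\langle E\rangle\) (this notation is introduced here for illustration and does not appear in the original review):
    \[
      \max_{p_1,\dots,p_n}\; -\sum_{i=1}^{n} p_i \ln p_i
      \quad\text{subject to}\quad
      \sum_{i=1}^{n} p_i = 1, \qquad \sum_{i=1}^{n} p_i E_i = \langle E\rangle .
    \]
    Introducing Lagrange multipliers for the two constraints and setting the derivative with respect to each \(p_i\) to zero gives the Gibbs (canonical) distribution
    \[
      p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_{j=1}^{n} e^{-\beta E_j},
    \]
    where \(\beta\) is fixed by the mean-value constraint. If only the normalization constraint is imposed, the same maximization yields the uniform distribution \(p_i = 1/n\), which is the content of Laplace's principle of insufficient reason.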
    Keywords: canonical distribution; principle of maximum entropy; variational principle; Shannon entropy; Laplace's principle of insufficient reason