Entropy and effective support size (Q925758)

From MaRDI portal
Property / DOI: 10.3390/e8030169
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.3390/e8030169
Property / OpenAlex ID: W2076284810
Property / cites work: Information Theory and Statistical Mechanics
Property / cites work: Q3260837
Property / cites work: A Mathematical Theory of Communication
Property / cites work: Some observations on the concepts of information-theoretic entropy and randomness
Property / cites work: Rényi information, loglikelihood and an intrinsic distribution measure
Property / cites work: On the entropy of continuous probability distributions (Corresp.)

Latest revision as of 08:19, 10 December 2024

scientific article

Language: English
Label: Entropy and effective support size
Description: scientific article

    Statements

    Entropy and effective support size (English)
    22 May 2008
    Summary: The notion of the effective size of support (Ess) of a random variable is introduced. A small set of natural requirements that a measure of Ess should satisfy is presented. A measure with the prescribed properties is in a direct (exp-) relationship to the family of Rényi's α-entropies, which also includes Shannon's entropy \(H\). Considerations of the choice of the value of α imply that exp(\(H\)) appears to be the most appropriate measure of Ess. Thanks to their log/exp relationship, entropy and Ess can be viewed as two aspects of the same thing. In probability and statistics, the Ess aspect could appear more basic than the entropic one.
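    The log/exp relationship described in the summary can be sketched numerically. The following minimal illustration (the function names `shannon_entropy`, `renyi_entropy`, and `ess` are my own, not from the paper) shows that exp(\(H\)) reports how many equally likely outcomes would produce the same entropy: a uniform distribution over \(n\) outcomes has Ess exactly \(n\), while a skewed distribution has an effective support smaller than its nominal one.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in nats; zero-probability outcomes are skipped."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0); the alpha -> 1 limit
    recovers Shannon entropy, handled here as a special case."""
    if alpha == 1:
        return shannon_entropy(p)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def ess(p, alpha=1):
    """Effective support size: exp of the order-alpha Rényi entropy.
    For alpha = 1 this is exp(H), the measure the summary singles out."""
    return math.exp(renyi_entropy(p, alpha))

# A uniform distribution over n outcomes has Ess exactly n, for every alpha.
uniform = [0.25] * 4
print(ess(uniform))        # 4.0 (up to floating-point error)

# A skewed distribution on 4 outcomes has Ess strictly between 1 and 4.
skewed = [0.7, 0.1, 0.1, 0.1]
print(ess(skewed))
```

    The same `ess` function with different values of `alpha` exponentiates the other members of the Rényi family, which is one way to compare the candidate measures the summary alludes to.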
    Rényi's entropy
    Shannon's entropy
    support
    interpretation
    probability
    statistics

    Identifiers

    DOI: 10.3390/e8030169
    OpenAlex ID: W2076284810