Entropy and divergence associated with power function and the statistical application (Q653348)

Language: English
Label: Entropy and divergence associated with power function and the statistical application
Description: scientific article

    Statements

    Title: Entropy and divergence associated with power function and the statistical application (English)
    Publication date: 9 January 2012
    Summary: In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but that optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method closely related to Tsallis entropy is proposed and shown to be robust against outliers, and a local learning property associated with the method is discussed.
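    The robustness idea in the summary can be illustrated with a minimal sketch. The code below fits a Gaussian to contaminated data by minimizing the density power divergence of Basu et al. (1998), a close relative of the power divergences studied in the paper, rather than the paper's own projective power divergence; the Gaussian model, the tuning value beta = 0.5, and the toy data are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, beta=0.5):
    """Density power divergence objective for a Gaussian model.

    Illustrative sketch (Basu et al.-style estimator, not the paper's
    projective power divergence); beta = 0.5 is an assumed tuning value.
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # optimize log(sigma) to keep sigma positive
    # Closed form of the model term: integral of f_theta(x)^(1+beta) dx for N(mu, sigma^2).
    model_term = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    # Data term: each observation enters only through f_theta(x_i)^beta,
    # so gross outliers (tiny model density) have almost no influence.
    data_term = (1 + 1 / beta) * np.mean(norm.pdf(x, loc=mu, scale=sigma) ** beta)
    return model_term - data_term

# Toy data: 95% from N(0, 1) plus 5% outliers from N(8, 1).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])

res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"sample mean = {x.mean():.2f}, robust mu = {mu_hat:.2f}, robust sigma = {sigma_hat:.2f}")
```

    Because outlying points contribute to the objective only through their (vanishingly small) model density raised to the power beta, the fitted mean stays near 0 while the plain sample mean is pulled toward the contamination; as beta tends to 0 the objective reduces to the negative log-likelihood, recovering ordinary maximum likelihood with its lack of robustness. This down-weighting of distant points is one way to read the local learning property mentioned in the summary.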
    Keywords: Tsallis entropy; projective power divergence; robustness
