Entropy and divergence associated with power function and the statistical application (Q653348)

scientific article; zbMATH DE number 5995844

      Statements

      Entropy and divergence associated with power function and the statistical application (English)
      9 January 2012
Summary: In statistical physics, the Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects the Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and we discuss a local learning property associated with the method.
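For orientation, here is a minimal sketch of the standard definitions the summary alludes to; these are common textbook forms, not formulas quoted from the article, whose normalizations may differ. The Boltzmann-Shannon entropy and the Kullback-Leibler divergence are

H(p) = -\int p(x)\log p(x)\,dx, \qquad
D_{\mathrm{KL}}(p\,\|\,q) = \int p(x)\log\frac{p(x)}{q(x)}\,dx = -H(p) - \mathbb{E}_p[\log q(X)],

so maximizing the expected log-likelihood \mathbb{E}_p[\log q(X)] over a model q is equivalent to minimizing the Kullback-Leibler divergence; this is the sense in which the divergence connects the entropy with maximum likelihood. The Tsallis entropy of order 1+\gamma,

H_{1+\gamma}(p) = \frac{1 - \int p(x)^{1+\gamma}\,dx}{\gamma},

recovers H(p) as \gamma \to 0, and one common form of the associated projective power (\gamma-)divergence in this literature is

D_{\gamma}(g, f) = \frac{1}{\gamma(1+\gamma)}\log\!\int g^{1+\gamma}\,dx
 - \frac{1}{\gamma}\log\!\int g\,f^{\gamma}\,dx
 + \frac{1}{1+\gamma}\log\!\int f^{1+\gamma}\,dx,

which is invariant under rescaling f \mapsto c f (hence "projective") and tends to D_{\mathrm{KL}}(g\,\|\,f) as \gamma \to 0.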
Keywords: Tsallis entropy; projective power divergence; robustness
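As a toy illustration of the robustness to outliers claimed in the summary (a sketch under stated assumptions, not the article's procedure): for a Gaussian location model with known scale sigma, minimizing the gamma-divergence above between the empirical distribution and N(mu, sigma^2) reduces to the weighted-mean fixed point mu = sum_i(w_i x_i) / sum_i(w_i) with w_i = exp(-gamma (x_i - mu)^2 / (2 sigma^2)), because \int f_mu^{1+gamma} dx does not depend on mu. Gross outliers therefore receive exponentially small weight, while the maximum likelihood estimate, the unweighted sample mean, is dragged toward them. The function name gamma_location and the choice gamma = 0.5 are illustrative.

import numpy as np

def gamma_location(x, sigma=1.0, gamma=0.5, max_iter=200, tol=1e-10):
    # Fixed-point iteration for the gamma-divergence location estimate of a
    # Gaussian with known scale: mu = sum(w * x) / sum(w), where the weights
    # w_i = exp(-gamma * (x_i - mu)^2 / (2 sigma^2)) shrink for outliers.
    mu = float(np.median(x))  # robust starting point
    for _ in range(max_iter):
        w = np.exp(-gamma * (x - mu) ** 2 / (2.0 * sigma ** 2))
        mu_new = float(np.sum(w * x) / np.sum(w))
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),   # inliers from N(0, 1)
                    rng.normal(10.0, 1.0, 5)])  # 5% gross contamination
print("sample mean (maximum likelihood):", x.mean())          # pulled toward 10
print("gamma-divergence estimate:", gamma_location(x))        # stays near 0

The exponential weighting also hints at the local learning property mentioned in the summary: estimation is driven mainly by observations near the current fit.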

      Identifiers

zbMATH DE number: 5995844
MaRDI portal item: Q653348