Entropy and divergence associated with power function and the statistical application (Q653348)
From MaRDI portal
Language | Label | Description | Also known as |
---|---|---|---|
English | Entropy and divergence associated with power function and the statistical application | scientific article | |
Statements
Entropy and divergence associated with power function and the statistical application (English)
Publication date: 9 January 2012
Summary: In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but that optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and a local learning property associated with the method is discussed.
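For orientation, here is a minimal sketch of the quantities the summary names, written with the normalizations standard in the literature; the paper's own conventions (sign and indexing of the power parameter) may differ.

```latex
% Standard forms from the literature; the paper's exact
% normalizations may differ.

% Tsallis entropy of order q; the limit q -> 1 recovers the
% Boltzmann-Shannon entropy  -\int p \log p .
S_q(p) = \frac{1}{q-1}\Bigl(1 - \int p(x)^q \, dx\Bigr)

% Kullback-Leibler divergence; minimizing it over a model g is
% equivalent to maximizing the expected log-likelihood.
D_{\mathrm{KL}}(f, g) = \int f(x) \log \frac{f(x)}{g(x)} \, dx

% Projective power divergence (gamma-divergence) in one standard form;
% it is invariant under rescaling of either argument, and the limit
% gamma -> 0 recovers D_KL.
D_\gamma(f, g) = \frac{1}{\gamma(1+\gamma)} \log \int f^{1+\gamma}
               - \frac{1}{\gamma} \log \int f g^{\gamma}
               + \frac{1}{1+\gamma} \log \int g^{1+\gamma}
```

Maximum likelihood is recovered in the limit \(\gamma \to 0\), while \(\gamma > 0\) downweights observations to which the model assigns low density; this is the usual explanation of the robustness against outliers mentioned in the summary.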
Keywords: Tsallis entropy; projective power divergence; robustness