Entropy and divergence associated with power function and the statistical application (Q653348)
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.3390/e12020262
Property / OpenAlex ID: W1979804848
Property / cites work: Asymptotic efficiency of statistical estimators: concepts and higher order asymptotic efficiency
Property / cites work: Q5538617
Property / cites work: Robust estimation in the normal mixture model
Property / cites work: Robust Blind Source Separation by Beta Divergence
Property / cites work: Exploring Latent Structure of Mixture ICA Models by the Minimum β-Divergence Method
Property / cites work: A class of logistic-type discriminant functions
Property / cites work: Robust Loss Functions for Boosting
Property / cites work: Information Geometry of U-Boost and Bregman Divergence
Property / cites work: Robustifying AdaBoost by Adding the Naive Error Rate
Property / cites work: Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
Property / cites work: Q3603775
Property / cites work: Robust and efficient estimation by minimising a density power divergence
Property / cites work: Note on the Consistency of the Maximum Likelihood Estimate
Property / cites work: Robust parameter estimation with a small bias against heavy contamination
Property / cites work: A Class of Local Likelihood Methods and Near-Parametric Asymptotics
Property / cites work: Robust extraction of local structures by the minimum β-divergence method
Property / cites work: Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
Property / cites work: The elements of statistical learning. Data mining, inference, and prediction
Property / cites work: Boosting the margin: a new explanation for the effectiveness of voting methods


Language: English
Label: Entropy and divergence associated with power function and the statistical application
Description: scientific article

    Statements

    Entropy and divergence associated with power function and the statistical application (English)
    9 January 2012
    Summary: In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is supported by its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and we discuss a local learning property associated with the method.
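
    As background for the summary (a sketch using standard definitions; the densities p and q and the power index β > 0 are notation introduced here, not taken from this page, and the normalization may differ from the article's): the Kullback-Leibler divergence decomposes as
    \[
      D_{\mathrm{KL}}(p,q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx
        = -H(p) - \mathbb{E}_{p}\bigl[\log q(X)\bigr],
      \qquad
      H(p) = -\int p(x)\,\log p(x)\,dx,
    \]
    so minimizing D_KL(p, q) over a model q is the same as maximizing the expected log-likelihood E_p[log q(X)]; this is the link between Boltzmann-Shannon entropy and maximum likelihood referred to above. A standard member of the power-divergence family, the density power (β-)divergence, can be written as
    \[
      D_{\beta}(p,q) = \frac{1}{\beta(\beta+1)} \int \Bigl( p(x)^{\beta+1} + \beta\, q(x)^{\beta+1} - (\beta+1)\, p(x)\, q(x)^{\beta} \Bigr)\,dx ,
      \qquad \beta > 0,
    \]
    which recovers D_KL(p, q) in the limit β → 0. Estimating equations derived from it weight each observation by q(x)^β, so observations lying where the model density is small (outliers) have little influence; the projective power divergence named in the keywords below is a related, scale-invariant variant.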
    Tsallis entropy
    projective power divergence
    robustness