Entropy and divergence associated with power function and the statistical application
Publication: 653348
DOI: 10.3390/e12020262
zbMATH: 1229.82010
OpenAlex: W1979804848
MaRDI QID: Q653348
Publication date: 9 January 2012
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e12020262
MSC classifications
- Foundations of equilibrium statistical mechanics (82B03)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
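As context for the title (a sketch, not part of the original record): the power-function divergence studied in this article is closely related to the density power divergence of the cited work "Robust and efficient estimation by minimising a density power divergence"; in one standard normalization, with notation assumed here rather than taken from the record, it reads
\[
D_\beta(g,f) \;=\; \int \left\{ \frac{g^{1+\beta}}{\beta(1+\beta)} \;-\; \frac{g\,f^{\beta}}{\beta} \;+\; \frac{f^{1+\beta}}{1+\beta} \right\} d\mu, \qquad \beta > 0,
\]
which satisfies \(D_\beta(g,f) \ge 0\) with equality iff \(g = f\), and recovers the Kullback-Leibler divergence \(\int g \log(g/f)\, d\mu\) in the limit \(\beta \to 0\).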
Related Items
- Statistical analysis of distance estimators with density differences and density ratios
- Duality of maximum entropy and minimum divergence
- Least informative distributions in maximum \(q\)-log-likelihood estimation
- Exponentiality test based on alpha-divergence and gamma-divergence
- Projective power entropy and maximum Tsallis entropy distributions
- Minimum information divergence of Q-functions for dynamic treatment resumes
- Normalized estimating equation for robust parameter estimation
- Spontaneous Clustering via Minimum Gamma-Divergence
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Entropic risk minimization for nonparametric estimation of mixing distributions
Cites Work
- Robust parameter estimation with a small bias against heavy contamination
- Asymptotic efficiency of statistical estimators: concepts and higher order asymptotic efficiency
- Robust extraction of local structures by the minimum \(\beta\)-divergence method
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Robust estimation in the normal mixture model
- Robust Blind Source Separation by Beta Divergence
- Exploring Latent Structure of Mixture ICA Models by the Minimum β-Divergence Method
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- A Class of Local Likelihood Methods and Near-Parametric Asymptotics
- Robust and efficient estimation by minimising a density power divergence
- A class of logistic-type discriminant functions
- Robustifying AdaBoost by Adding the Naive Error Rate
- Information Geometry of U-Boost and Bregman Divergence
- Robust Loss Functions for Boosting
- Note on the Consistency of the Maximum Likelihood Estimate
- The elements of statistical learning. Data mining, inference, and prediction