Entropy and divergence associated with power function and the statistical application
Publication: Q653348
DOI: 10.3390/e12020262
zbMATH Open: 1229.82010
OpenAlex: W1979804848
MaRDI QID: Q653348
FDO: Q653348
Publication date: 9 January 2012
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e12020262
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Measures of information, entropy (94A17)
- Foundations of equilibrium statistical mechanics (82B03)
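For context, the power-type divergence in the title is closely related to the density power divergence of Basu et al. (listed under Cites Work below). A minimal sketch of that standard form, with power index \(\beta > 0\) (the paper's own normalization and projective variant may differ):

\[
D_\beta(g, f) = \int \left\{ f(x)^{1+\beta} - \frac{1+\beta}{\beta}\, g(x)\, f(x)^{\beta} + \frac{1}{\beta}\, g(x)^{1+\beta} \right\} dx,
\]

which recovers the Kullback-Leibler divergence \(\mathrm{KL}(g, f)\) in the limit \(\beta \to 0\).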
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- A class of logistic-type discriminant functions
- Title not available
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- A Class of Local Likelihood Methods and Near-Parametric Asymptotics
- Robust estimation in the normal mixture model
- Robust Blind Source Separation by Beta Divergence
- Robust and efficient estimation by minimising a density power divergence
- Note on the Consistency of the Maximum Likelihood Estimate
- Robust parameter estimation with a small bias against heavy contamination
- Asymptotic efficiency of statistical estimators: concepts and higher order asymptotic efficiency
- Information geometry and statistical pattern recognition
- Information Geometry of U-Boost and Bregman Divergence
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Robustifying AdaBoost by Adding the Naive Error Rate
- Robust extraction of local structures by the minimum \(\beta\)-divergence method
- Exploring Latent Structure of Mixture ICA Models by the Minimum β-Divergence Method
- Robust Loss Functions for Boosting
Cited In (12)
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Minimum information divergence of Q-functions for dynamic treatment regimes
- Entropic risk minimization for nonparametric estimation of mixing distributions
- Projective power entropy and maximum Tsallis entropy distributions
- Normalized estimating equation for robust parameter estimation
- Duality of maximum entropy and minimum divergence
- Statistical analysis of distance estimators with density differences and density ratios
- Determination of a power density by an entropy regularization method
- Least informative distributions in maximum \(q\)-log-likelihood estimation
- Spontaneous Clustering via Minimum Gamma-Divergence
- Exponentiality test based on alpha-divergence and gamma-divergence
- Testing bivariate independence based on \(\alpha\)-divergence by improved probit transformation method for copula density estimation