Entropy and divergence associated with power function and the statistical application
From MaRDI portal
Cites work
- Scientific article; zbMATH DE number 3251902 (no title available)
- A Class of Local Likelihood Methods and Near-Parametric Asymptotics
- A class of logistic-type discriminant functions
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Asymptotic efficiency of statistical estimators: concepts and higher order asymptotic efficiency
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Exploring Latent Structure of Mixture ICA Models by the Minimum β-Divergence Method
- Information Geometry of U-Boost and Bregman Divergence
- Information geometry and statistical pattern recognition
- Note on the Consistency of the Maximum Likelihood Estimate
- Robust Blind Source Separation by Beta Divergence
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Robust Loss Functions for Boosting
- Robust and efficient estimation by minimising a density power divergence
- Robust estimation in the normal mixture model
- Robust extraction of local structures by the minimum β-divergence method
- Robust parameter estimation with a small bias against heavy contamination
- Robustifying AdaBoost by Adding the Naive Error Rate
- The elements of statistical learning. Data mining, inference, and prediction
Cited in (12)
- Projective power entropy and maximum Tsallis entropy distributions
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Determination of a power density by an entropy regularization method
- Testing bivariate independence based on α-divergence by improved probit transformation method for copula density estimation
- Entropic risk minimization for nonparametric estimation of mixing distributions
- Least informative distributions in maximum q-log-likelihood estimation
- Minimum information divergence of Q-functions for dynamic treatment resumes
- Spontaneous clustering via minimum gamma-divergence
- Duality of maximum entropy and minimum divergence
- Statistical analysis of distance estimators with density differences and density ratios
- Exponentiality test based on alpha-divergence and gamma-divergence
- Normalized estimating equation for robust parameter estimation
MaRDI item Q653348