Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
From MaRDI portal
Recommendations
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- Estimators for the binomial distribution that dominate the MLE in terms of Kullback-Leibler risk
- Parameter estimation based on cumulative Kullback-Leibler divergence
- scientific article; zbMATH DE number 3930880
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
Cites work
- scientific article; zbMATH DE number 3761167
- scientific article; zbMATH DE number 44577
- scientific article; zbMATH DE number 48436
- scientific article; zbMATH DE number 1055955
- scientific article; zbMATH DE number 1158743
- scientific article; zbMATH DE number 3441432
- scientific article; zbMATH DE number 3241743
- scientific article; zbMATH DE number 3274494
- Approximation of density functions by sequences of exponential families
- Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss
- Decomposing posterior variance
- Desiderata for a predictive theory of statistics
- Differential-geometrical methods in statistics
- Distribution estimation consistent in total variation and in two types of information divergence
- I-divergence geometry of probability distributions and minimization problems
- Intrinsic analysis of statistical estimation
- On Kullback-Leibler loss and density estimation
- Recursive nonlinear estimation. A geometric approach
- Sanov property, generalized I-projection and a conditional limit theorem
- Simultaneous estimation of the Hardy-Weinberg proportions
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
Cited in (7)
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
- Lower bounds for the trade-off between bias and mean absolute deviation
- Estimators for the binomial distribution that dominate the MLE in terms of Kullback-Leibler risk
- Letter to the Editor: Zhang, J. (2021), “The Mean Relative Entropy: An Invariant Measure of Estimation Error,” The American Statistician, 75, 117–123: comment by Vos and Wu
- Generalized estimators, slope, efficiency, and Fisher information bounds
- Using geometry to select one dimensional exponential families that are monotone likelihood ratio in the sample space, are weakly unimodal and can be parametrized by a measure of central tendency
This page was built for publication: Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators