Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
DOI: 10.1016/j.jspi.2012.01.002
zbMATH Open: 1242.62026
OpenAlex: W2005899734
MaRDI QID: Q434565
Publication date: 16 July 2012
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2012.01.002
Recommendations
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- Estimators for the binomial distribution that dominate the MLE in terms of Kullback-Leibler risk
- Parameter estimation based on cumulative Kullback-Leibler divergence
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
Keywords: \(\mathcal P\)-bias; \(\mathcal P\)-variance; distribution unbiased; dual KL risk; KL bias; KL mean; KL risk; KL variance
MSC classification:
- Nonparametric estimation (62G05)
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
- Genetics and epigenetics (92D10)
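For orientation, the keywords above suggest a bias-variance style decomposition of the Kullback-Leibler risk. The display below is only an illustrative reading of those terms, assuming a "KL mean" \(p_{\bar{\theta}}\) for which a Pythagorean-type identity holds; the paper's exact definitions of KL mean, KL bias, KL variance, and the dual KL risk may differ.
\[
\underbrace{\mathrm{E}_{\theta}\!\left[D_{\mathrm{KL}}\!\left(p_{\theta}\,\Vert\,p_{\hat{\theta}}\right)\right]}_{\text{KL risk}}
\;=\;
\underbrace{\mathrm{E}_{\theta}\!\left[D_{\mathrm{KL}}\!\left(p_{\bar{\theta}}\,\Vert\,p_{\hat{\theta}}\right)\right]}_{\text{KL variance}}
\;+\;
\underbrace{D_{\mathrm{KL}}\!\left(p_{\theta}\,\Vert\,p_{\bar{\theta}}\right)}_{\text{KL bias}}
\]
On this reading, an estimator would be called distribution unbiased when the KL bias term vanishes, i.e. when \(p_{\bar{\theta}} = p_{\theta}\).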
Cites Work
- On Kullback-Leibler loss and density estimation
- I-divergence geometry of probability distributions and minimization problems
- Differential-geometrical methods in statistics.
- Approximation of density functions by sequences of exponential families
- Decomposing posterior variance
- Intrinsic analysis of statistical estimation
- Sanov property, generalized I-projection and a conditional limit theorem
- Distribution estimation consistent in total variation and in two types of information divergence
- Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
- Recursive nonlinear estimation. A geometric approach
- Desiderata for a predictive theory of statistics
- Simultaneous estimation of the Hardy-Weinberg proportions
Cited In (7)
- Generalized estimators, slope, efficiency, and Fisher information bounds
- Estimators for the binomial distribution that dominate the MLE in terms of Kullback-Leibler risk
- Lower bounds for the trade-off between bias and mean absolute deviation
- Using geometry to select one dimensional exponential families that are monotone likelihood ratio in the sample space, are weakly unimodal and can be parametrized by a measure of central tendency
- Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
- The Kullback-Leibler risk of the Stein estimator and the conditional MLE
- Letter to the Editor: Zhang, J. (2021), “The Mean Relative Entropy: An Invariant Measure of Estimation Error,” The American Statistician, 75, 117–123: comment by Vos and Wu