Implications of the Cressie-Read family of additive divergences for information recovery
DOI: 10.3390/e14122427 · zbMATH Open: 1305.94018 · OpenAlex: W2083326001 · MaRDI QID: Q406232 · FDO: Q406232
Authors: Ron C. Mittelhammer, George G. Judge
Publication date: 8 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e14122427
Recommendations
- Several applications of divergence criteria in continuous families
- A generalized divergence for statistical inference
- Some information theoretic ideas useful in statistical inference
- Robust statistical inference based on the \(C\)-divergence family
- Estimation and inference in the case of competing sets of estimating equations
Keywords: conditional moment equations; Cressie-Read divergence; information functionals; information theoretic methods; minimum power divergence
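For orientation, the Cressie-Read divergence named in the keywords is the one-parameter family of power divergences introduced by Cressie and Read (1984); a standard statement of it for probability vectors \(p\) and \(q\) is
\[
I(p, q; \lambda) = \frac{1}{\lambda(\lambda+1)} \sum_{i=1}^{n} p_i \left[ \left( \frac{p_i}{q_i} \right)^{\lambda} - 1 \right], \qquad \lambda \in \mathbb{R},
\]
with the limiting cases \(\lambda \to 0\) and \(\lambda \to -1\) recovering the two directed Kullback-Leibler divergences, and \(\lambda = 1\) giving one half of the Pearson \(\chi^2\) statistic.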
Cites Work
- On Information and Sufficiency
- A Mathematical Theory of Communication
- Title not available
- Estimation with quadratic loss
- Possible generalization of Boltzmann-Gibbs statistics
- Empirical likelihood as a goodness-of-fit measure
- Goodness-of-fit statistics for discrete multivariate data
- Title not available
- A new class of metric divergences on probability spaces and its applicability in statistics
- Notes on bias in estimators for simultaneous equation models
- Entropy: the Markov ordering approach
- Large deviation strategy for inverse problem. I
- Typical support and Sanov large deviations of correlated states