Implications of the Cressie-Read family of additive divergences for information recovery
DOI: 10.3390/e14122427
zbMATH Open: 1305.94018
OpenAlex: W2083326001
MaRDI QID: Q406232
George G. Judge, Ron C. Mittelhammer
Publication date: 8 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e14122427
Keywords: conditional moment equations; Cressie-Read divergence; information functionals; information theoretic methods; minimum power divergence
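For context (an editorial note, not part of the MaRDI record): the Cressie-Read family named in the keywords is the power-divergence family of Cressie and Read (1984). In one standard parametrization (the paper's own notation may differ), for discrete distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_n) it reads

\[
  I(p, q; \lambda) \;=\; \frac{1}{\lambda(\lambda + 1)} \sum_{i=1}^{n} p_i \left[ \left( \frac{p_i}{q_i} \right)^{\lambda} - 1 \right].
\]

The limit \(\lambda \to 0\) gives the Kullback-Leibler divergence \(\sum_i p_i \log(p_i/q_i)\), \(\lambda = 1\) gives one half of Pearson's chi-squared statistic \(\frac{1}{2}\sum_i (p_i - q_i)^2 / q_i\), and \(\lambda \to -1\) gives the reverse Kullback-Leibler divergence underlying empirical likelihood.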
Cites Work
- On Information and Sufficiency
- A Mathematical Theory of Communication
- Possible generalization of Boltzmann-Gibbs statistics
- Empirical likelihood as a goodness-of-fit measure
- Goodness-of-fit statistics for discrete multivariate data
- A new class of metric divergences on probability spaces and its applicability in statistics
- Notes on bias in estimators for simultaneous equation models
- Entropy: the Markov ordering approach
- Large Deviation Strategy for Inverse Problem I
- Typical support and Sanov large deviations of correlated states