Implications of the Cressie-Read family of additive divergences for information recovery
From MaRDI portal
Recommendations
- Several applications of divergence criteria in continuous families
- A generalized divergence for statistical inference
- Some information theoretic ideas useful in statistical inference
- Robust statistical inference based on the \(C\)-divergence family
- Estimation and inference in the case of competing sets of estimating equations
Cites work
- scientific article; zbMATH DE number 3911472 (no title available)
- scientific article; zbMATH DE number 3614055 (no title available)
- A Mathematical Theory of Communication
- A new class of metric divergences on probability spaces and its applicability in statistics
- Empirical likelihood as a goodness-of-fit measure
- Entropy: the Markov ordering approach
- Estimation with quadratic loss
- Goodness-of-fit statistics for discrete multivariate data
- Large deviation strategy for inverse problem. I
- Notes on bias in estimators for simultaneous equation models
- On Information and Sufficiency
- Possible generalization of Boltzmann-Gibbs statistics
- Typical support and Sanov large deviations of correlated states