Statistical aspects of divergence measures
Publication: 1096987
DOI: 10.1016/0378-3758(87)90063-2
zbMath: 0634.62004
OpenAlex: W1978969504
MaRDI QID: Q1096987
Publication date: 1987
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/0378-3758(87)90063-2
Keywords: monotonicity; convexity; Cramér-Rao inequality; isotonic functions; Fisher's information; Blackwell order structure; generalized divergence measures
Theory of statistical experiments (62B15); Statistical aspects of information-theoretic topics (62B10)
Related Items (10)
- Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
- Information and random censoring
- On the issue of convergence of certain divergence measures related to finding most nearly compatible probability distribution under the discrete set-up
- \((R,S)\)-information radius of type \(t\) and comparison of experiments
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects
- Connections of generalized divergence measures with Fisher information matrix
- Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
- A fuzzy probabilistic information system comparison criterion: Applications and properties
- Generalized Jensen difference divergence measures and Fisher measure of information
Cites Work
- A Mathematical Theory of Communication
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- The relation between information theory and the differential geometry approach to statistics
- Information in experiments and sufficiency
- I-divergence geometry of probability distributions and minimization problems
- Comparison of experiments and information measures
- New parametric measures of information
- On a Measure of the Information Provided by an Experiment
- On the convexity of some divergence measures based on entropy functions
- On Information and Sufficiency
- On the attainment of the Cramer-Rao lower bound