Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
DOI: 10.1016/0020-0255(95)00017-J · zbMath: 0877.94018 · OpenAlex: W2060198466 · MaRDI QID: Q1358818
Domingo Morales, Leandro Pardo, M. Luisa Menendez, Miguel Salicrú
Publication date: 23 June 1997
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/0020-0255(95)00017-j
Cites Work
- Informative geometry of probability spaces
- The relation between information theory and the differential geometry approach to statistics
- Statistical aspects of divergence measures
- Connections of generalized divergence measures with Fisher information matrix
- Asymptotic properties of divergence statistics in a stratified random sampling and its applications to test statistical hypotheses
- Differential-geometrical methods in statistics
- A distance between multivariate normal distributions based in an embedding into the Siegel group
- New parametric measures of information
- Information radius
- On Information and Sufficiency