Limiting properties of some measures of information
DOI: 10.1007/BF00050661 · zbMATH Open: 0725.62006 · OpenAlex: W1971491488 · MaRDI QID: Q2277693
Publication date: 1989
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/bf00050661
Keywords: Fisher information; convergence of information measures; convergence of Radon-Nikodym derivatives; phi-divergence; Rényi information; Csiszár information
MSC: Statistical aspects of information-theoretic topics (62B10); Asymptotic distribution theory in statistics (62E20); Measures of information, entropy (94A17)
Cites Work
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Title not available
- Title not available
- On the f-divergence and singularity of probability measures
- Title not available
- Entropy and the central limit theorem
- Title not available
- Title not available
- New parametric measures of information
- Some limiting properties of Matusita's measure of distance
Cited In (9)
- Title not available
- On Two Forms of Fisher's Measure of Information
- Fisher information matrix: a tool for dimension reduction, projection pursuit, independent component analysis, and more
- Fisher's information matrix and φ−divergence for finite and optimal partitions of the sample space
- Divergences without probability vectors and their applications
- Entropy, divergence and distance measures with econometric applications
- Title not available
- Limits for the Precision and Value of Information from Dependent Sources
- On convergence of conditional probability measures