The relation between information theory and the differential geometry approach to statistics
From MaRDI portal
DOI: 10.1016/0020-0255(85)90050-7
zbMath: 0609.62006
OpenAlex: W2052608203
Wikidata: Q115368296 (Scholia: Q115368296)
MaRDI QID: Q1086923
Publication date: 1985
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/0020-0255(85)90050-7
Keywords: Riemannian metric; maximum entropy; information theory; Fisher information matrix; vector fields; information measures; Bhattacharyya distance; probability simplex; Matusita distance; Kullback distance; measures of divergence; minimum divergence estimates
MSC classifications: Foundations and philosophical topics in statistics (62A01); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Classical differential geometry (53A99)
Related Items
- Metric bounds on losses in adaptive coding
- Statistical aspects of divergence measures
- CHARACTERIZING THE DEPOLARIZING QUANTUM CHANNEL IN TERMS OF RIEMANNIAN GEOMETRY
- Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
- Geometry of canonical correlation on the state space of a quantum system
- Fisher information under restriction of Shannon information in multi-terminal situations
- Geometry of quantum inference
- Improved neural networks based on mutual information via information geometry
- Distribution metrics and image segmentation
- The curvature induced by covariance
- Connections on Non-Parametric Statistical Manifolds by Orlicz Space Geometry
- Statistical distance and the geometry of quantum states
- Connections of generalized divergence measures with Fisher information matrix
- Generalized Jensen difference divergence measures and Fisher measure of information
Cites Work
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Decision rule, based on the distance, for the classification problem
- Accounting for intrinsic nonlinearity in nonlinear regression parameter inference regions
- Information geometry in functional spaces of classical and quantum finite statistical systems
- Defining the curvature of a statistical problem (with applications to second order efficiency)
- On measures of information and their characterizations
- Differential geometry of curved exponential families. Curvatures and information loss
- An Extended Cencov Characterization of the Information Metric
- The Renyi redundancy of generalized Huffman codes
- A new mathematical framework for the study of linkage and selection
- A coding theorem and Rényi's entropy
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation