The relation between information theory and the differential geometry approach to statistics (Q1086923)
Author: L. Lorne Campbell
Language | Label | Description | Also known as
---|---|---|---
English | The relation between information theory and the differential geometry approach to statistics | scientific article |
Statements
The relation between information theory and the differential geometry approach to statistics (English)
Publication year: 1985
\textit{N. N. Chentsov} [Statistical decision rules and optimal inference. Transl. Math. Monogr. 53 (1982; Zbl 0484.62008)] has shown that the Riemannian metric on the probability simplex \(\sum_i x_i = 1\) defined by \((ds)^2 = \sum_i (dx_i)^2/x_i\) has an invariance property under certain probabilistically natural mappings, and that no other Riemannian metric has the same property. The geometry associated with this metric is shown to lead almost automatically to the measures of divergence between probability distributions associated with Kullback, Bhattacharyya, and Matusita. Certain vector fields are associated in a natural way with random variables; the integral curves of these vector fields yield the maximum entropy or minimum divergence estimates of probabilities. Some other consequences of this geometric view are also explored.
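The review's remark that the metric leads "almost automatically" to the classical divergences can be made concrete: the substitution \(y_i = 2\sqrt{x_i}\) maps the simplex isometrically onto a portion of the sphere of radius 2, so the geodesic distance between distributions \(p\) and \(q\) is \(2\arccos\sum_i \sqrt{p_i q_i}\), with the Bhattacharyya coefficient appearing inside the arccosine. The sketch below computes this geodesic distance alongside the Kullback, Bhattacharyya, and Matusita measures for discrete distributions; it illustrates these standard formulas and is not code from the paper — the function names and toy distributions are ours.

```python
import numpy as np

def fisher_rao_distance(p, q):
    """Geodesic distance under Chentsov's metric (ds)^2 = sum_i (dx_i)^2 / x_i.
    Via y_i = 2*sqrt(x_i) the simplex maps onto a sphere of radius 2, so the
    distance is 2*arccos of the Bhattacharyya coefficient."""
    bc = np.sum(np.sqrt(p * q))                     # Bhattacharyya coefficient
    return 2.0 * np.arccos(np.clip(bc, 0.0, 1.0))  # clip guards rounding error

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance: -ln of the Bhattacharyya coefficient."""
    return -np.log(np.sum(np.sqrt(p * q)))

def matusita_distance(p, q):
    """Matusita distance: Euclidean distance between sqrt(p) and sqrt(q)."""
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def kullback_divergence(p, q):
    """Kullback-Leibler divergence sum_i p_i ln(p_i / q_i); assumes q_i > 0
    wherever p_i > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Toy distributions (hypothetical example data).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
for name, dist in [("Fisher-Rao", fisher_rao_distance),
                   ("Bhattacharyya", bhattacharyya_distance),
                   ("Matusita", matusita_distance),
                   ("Kullback", kullback_divergence)]:
    print(f"{name}: {dist(p, q):.6f}")
```

On this picture the Matusita distance is proportional to the chordal (straight-line) distance between the two sqrt-mapped points on the sphere, while the Fisher-Rao distance is the arc length between them.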
Keywords: information theory; information measures; Fisher information matrix; Bhattacharyya distance; Kullback distance; Matusita distance; Riemannian metric; probability simplex; measures of divergence; vector fields; maximum entropy; minimum divergence estimates