The relation between information theory and the differential geometry approach to statistics (Q1086923)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile / rank: Normal rank
Property / full work available at URL: https://doi.org/10.1016/0020-0255(85)90050-7 / rank: Normal rank
Property / OpenAlex ID: W2052608203 / rank: Normal rank
Property / cites work: On measures of information and their characterizations / rank: Normal rank
Property / cites work: Differential geometry of curved exponential families. Curvatures and information loss / rank: Normal rank
Property / cites work: Q5841953 / rank: Normal rank
Property / cites work: The Renyi redundancy of generalized Huffman codes / rank: Normal rank
Property / cites work: Entropy differential metric, distance and divergence measures in probability spaces: A unified approach / rank: Normal rank
Property / cites work: An Extended Cencov Characterization of the Information Metric / rank: Normal rank
Property / cites work: A coding theorem and Rényi's entropy / rank: Normal rank
Property / cites work: Q3943797 / rank: Normal rank
Property / cites work: Defining the curvature of a statistical problem (with applications to second order efficiency) / rank: Normal rank
Property / cites work: Q5595624 / rank: Normal rank
Property / cites work: Accounting for intrinsic nonlinearity in nonlinear regression parameter inference regions / rank: Normal rank
Property / cites work: Q5341405 / rank: Normal rank
Property / cites work: Information geometry in functional spaces of classical and quantum finite statistical systems / rank: Normal rank
Property / cites work: Q5529067 / rank: Normal rank
Property / cites work: Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation / rank: Normal rank
Property / cites work: Decision rule, based on the distance, for the classification problem / rank: Normal rank
Property / cites work: Q5845474 / rank: Normal rank
Property / cites work: A new mathematical framework for the study of linkage and selection / rank: Normal rank


scientific article
Language: English
Label: The relation between information theory and the differential geometry approach to statistics
Description: scientific article

    Statements

    The relation between information theory and the differential geometry approach to statistics (English)
    1985
    \textit{N. N. Chentsov} [Statistical decision rules and optimal inference. Transl. Math. Monogr. 53 (1982; Zbl 0484.62008)] has shown that the Riemannian metric on the probability simplex \(\sum x_i = 1\) defined by \((ds)^2 = \sum (dx_i)^2/x_i\) has an invariance property under certain probabilistically natural mappings. No other Riemannian metric has the same property. The geometry associated with this metric is shown to lead almost automatically to measures of divergence between probability distributions which are associated with Kullback, Bhattacharyya, and Matusita. Certain vector fields are associated in a natural way with random variables. The integral curves of these vector fields yield the maximum entropy or minimum divergence estimates of probabilities. Some other consequences of this geometric view are also explored.
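    The link between the metric and the Kullback divergence described above can be checked numerically: it is a standard fact (a sketch here, not a computation taken from the article itself) that for an infinitesimal perturbation \(dp\) tangent to the simplex, \(D(p \| p+dp)\) agrees with \(\tfrac{1}{2}(ds)^2 = \tfrac{1}{2}\sum (dp_i)^2/p_i\) to second order.

```python
import math

def kl_divergence(p, q):
    """Kullback divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def metric_length_squared(p, dp):
    """Chentsov/Fisher line element (ds)^2 = sum_i (dp_i)^2 / p_i
    for a tangent vector dp to the simplex (its components sum to 0)."""
    return sum(d * d / pi for pi, d in zip(p, dp))

# An arbitrary point on the probability simplex and a small tangent vector.
p = [0.2, 0.3, 0.5]
eps = 1e-4
dp = [eps, -2 * eps, eps]          # sums to 0, so p + dp stays on the simplex
q = [pi + d for pi, d in zip(p, dp)]

kl = kl_divergence(p, q)
half_ds2 = 0.5 * metric_length_squared(p, dp)
# The two quantities differ only at third order in eps.
assert abs(kl - half_ds2) < 1e-10
```

    The same second-order agreement holds at any interior point of the simplex; only the leading term is metric, which is why the divergences of Kullback, Bhattacharyya, and Matusita all induce the same (Chentsov) geometry.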
    information theory
    information measures
    Fisher information matrix
    Bhattacharyya distance
    Kullback distance
    Matusita distance
    Riemannian metric
    probability simplex
    measures of divergence
    vector fields
    maximum entropy
    minimum divergence estimates
