Differential-geometrical methods in statistics (Q2266300)

Author: Shun-ichi Amari


Language: English
Label: Differential-geometrical methods in statistics
Description: scientific article

    Statements

    Differential-geometrical methods in statistics (English)
    1985
    A parametric statistical model, i.e. a family of probability measures \((P_{\theta})\), where \(\theta\) runs through an open subset of some Euclidean space, may be regarded as a differentiable manifold. It is natural to equip it with the Riemannian metric induced by the Fisher information tensor, and one might hope that this geometry reveals something about the statistical properties of the model. On the other hand, several non-metric distances, such as the Kullback-Leibler distance, have been applied profitably in various statistical situations. This indicates that the purely Riemannian approach is too narrow for statistical purposes and that additional geometric concepts are needed. The current state of affairs can be read from this monograph by the author, who has contributed substantially to the theory.

    The book is divided into two parts. In the first part, fundamental differential-geometric concepts for statistical manifolds are introduced, starting with the tangent space and the like, which makes the book accessible even to readers who have lost all their differential-geometric knowledge. In contrast to ordinary differential geometry, a whole family of affine connections is defined. It contains the Riemannian connection induced by the Fisher information, but other, non-Riemannian connections turn out to be of even greater significance; thus one obtains different measures of curvature and different notions of flatness. Each of the affine connections is coupled to another one by a concept of duality. This duality is used to introduce a family of divergence measures between probability distributions, which includes the Kullback-Leibler distance, the Hellinger distance, Csiszár's divergence, etc. Statistical manifolds thus carry a geometric structure which apparently had not been considered before.

    The second part of the book contains applications to statistical inference, especially higher-order theory. Edgeworth expansions of the distributions of certain sufficient statistics are given explicitly in geometric terms. In estimation as well as in test theory, curvatures related to the different connections are shown to come into play. Other keywords are: interval estimators, first-, second- and third-order efficiency, ancillarity, conditional inference, nuisance parameters, and jackknifing. Altogether, the book presents a readable introduction to a theory which promises interesting developments in the future.
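    To fix notation for the central objects the review mentions (the following is one standard convention, offered for orientation rather than quoted from the book): the Fisher information metric on the parameter manifold and the \(\alpha\)-divergence family read

    \[
    g_{ij}(\theta) = \mathbb{E}_{\theta}\bigl[\partial_{i}\log p(x;\theta)\,\partial_{j}\log p(x;\theta)\bigr],
    \qquad
    D_{\alpha}(p\,\|\,q) = \frac{4}{1-\alpha^{2}}\Bigl(1 - \int p(x)^{\frac{1-\alpha}{2}}\,q(x)^{\frac{1+\alpha}{2}}\,dx\Bigr), \quad \alpha \neq \pm 1.
    \]

    In the limit \(\alpha \to -1\) one recovers the Kullback-Leibler distance \(\int p\,\log(p/q)\,dx\), while \(\alpha = 0\) yields a multiple of the squared Hellinger distance; the duality mentioned above pairs the \(\alpha\)-connection with the \((-\alpha)\)-connection.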
    Keywords

    Fisher information tensor
    Kullback-Leibler distance
    concept of duality
    divergence measures
    Hellinger's distance
    Csiszár's divergence
    statistical manifolds
    higher-order theory
    Edgeworth expansions
    sufficient statistics
    interval estimators
    efficiency
    ancillarity
    conditional inference
    nuisance parameters
    jackknifing
