Information geometry (Q5925492)


Language: English
Label: Information geometry
Description: scientific article; zbMATH DE number 7324370

    Statements

    Information geometry (English)
    17 March 2021
    This paper is based on the 23rd Takagi Lectures delivered by the author on June 8, 2019, at the Research Institute for Mathematical Sciences, Kyoto University. It is an interesting overview of topics in information geometry. Information geometry studies statistical manifolds, that is, Riemannian manifolds equipped with a symmetric cubic tensor or, equivalently, with a pair of dual affine connections. These manifolds can be regarded as Riemannian manifolds whose points correspond to probability distributions. Such structures are important in many areas: mathematics and physics, but also machine learning, signal processing, and neuroscience. Differential geometry plays a central role in this study. Information geometry is closely related to asymmetric divergence functions and to affine differential geometry. Indeed, a divergence function on a manifold induces a Riemannian structure together with a pair of dual affine connections (a sketch of this construction is given below). A dual pair of affine connections can also be defined on affine manifolds. In particular, on a Riemannian manifold with a pair of flat affine connections there is a unique canonical divergence, and a generalized Pythagorean theorem and a projection theorem hold. In this framework the Wasserstein distance, which measures the distance between probability distributions, is of particular interest. In this direction the author describes a divergence function obtained from the entropy-regularized Wasserstein problem and relates it to information geometry. Finally, some future perspectives are discussed.
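    A minimal sketch of the construction mentioned above (the standard Eguchi relations; the coordinate notation is ours, not taken from the paper): given a divergence function \(D(\xi \,\|\, \xi')\) written in local coordinates \(\xi, \xi'\), a Riemannian metric and a pair of dual affine connections are obtained by
    \[
      g_{ij}(\xi) = -\,\partial_i \partial_j' \, D(\xi \,\|\, \xi') \big|_{\xi' = \xi},
    \]
    \[
      \Gamma_{ij,k}(\xi) = -\,\partial_i \partial_j \partial_k' \, D(\xi \,\|\, \xi') \big|_{\xi' = \xi},
      \qquad
      \Gamma^{*}_{ij,k}(\xi) = -\,\partial_i' \partial_j' \partial_k \, D(\xi \,\|\, \xi') \big|_{\xi' = \xi},
    \]
    where \(\partial_i\) acts on the first argument and \(\partial_i'\) on the second; the connections \(\Gamma\) and \(\Gamma^{*}\) are dual with respect to \(g\). In the dually flat case the canonical divergence satisfies the generalized Pythagorean relation \(D(p \,\|\, r) = D(p \,\|\, q) + D(q \,\|\, r)\) whenever the geodesic from \(p\) to \(q\) with respect to one connection and the geodesic from \(q\) to \(r\) with respect to the dual connection meet orthogonally at \(q\).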
    canonical divergence
    dual affine connection
    information geometry
    Pythagorean theorem
    semiparametric statistics
    Wasserstein geometry

    Identifiers

    Full work available at URL: https://doi.org/10.1007/s11537-020-1920-5
    OpenAlex ID: W4245686311