Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds (Q280743)

Property / review text
 
Property / review text: Summary: Divergence functions are the non-symmetric ``distance'' on the manifold \(\mathcal M_\theta\) of parametric probability density functions over a measure space \((X,\mu)\). Classical information geometry prescribes on \(\mathcal M_\theta\): (i) a Riemannian metric given by the Fisher information; (ii) a pair of dual connections (giving rise to the family of \(\alpha\)-connections) that preserve the metric under parallel transport by their joint actions; and (iii) a family of divergence functions (the \(\alpha\)-divergence) defined on \(\mathcal M_\theta\times\mathcal M_\theta\), which induce the metric and the dual connections. Here, we construct an extension of this differential geometric structure from \(\mathcal M_\theta\) (that of parametric probability density functions) to the manifold \(\mathcal M\) of non-parametric functions on \(X\), removing the positivity and normalization constraints. The generalized Fisher information and \(\alpha\)-connections on \(\mathcal M\) are induced by an \(\alpha\)-parameterized family of divergence functions, reflecting the fundamental convex inequality associated with any smooth and strictly convex function. The infinite-dimensional manifold \(\mathcal M\) has zero curvature for all these \(\alpha\)-connections; hence, the generally non-zero curvature of \(\mathcal M_\theta\) can be interpreted as arising from an embedding of \(\mathcal M_\theta\) into \(\mathcal M\). Furthermore, when a parametric model (after a monotonic scaling) forms an affine submanifold, its natural and expectation parameters form biorthogonal coordinates, and such a submanifold is dually flat for \(\alpha=\pm 1\), generalizing the results of Amari's \(\alpha\)-embedding. The present analysis illuminates two different types of duality in information geometry, one concerning the referential status of a point (measurable function) expressed in the divergence function (``referential duality'') and the other concerning its representation under an arbitrary monotone scaling (``representational duality''). / rank
 
Normal rank
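The summary above refers to an \(\alpha\)-parameterized divergence family built from the convex inequality of a smooth, strictly convex function. As an illustrative sketch (the symbol \(\Phi\) and the finite-dimensional notation are ours; the paper works with measurable functions on \(X\) and their monotone representations), such a family can be written as
\[
\mathcal D^{(\alpha)}_{\Phi}(x,y)\;=\;\frac{4}{1-\alpha^{2}}\left[\frac{1-\alpha}{2}\,\Phi(x)+\frac{1+\alpha}{2}\,\Phi(y)-\Phi\!\left(\frac{1-\alpha}{2}\,x+\frac{1+\alpha}{2}\,y\right)\right],\qquad \alpha\in(-1,1),
\]
which is nonnegative by convexity and vanishes exactly when \(x=y\). Expanding on the diagonal \(y=x\) recovers a Riemannian metric given by the Hessian of \(\Phi\),
\[
g_{ij}(x)\;=\;-\left.\frac{\partial^{2}\mathcal D^{(\alpha)}_{\Phi}(x,y)}{\partial x^{i}\,\partial y^{j}}\right|_{y=x}\;=\;\frac{\partial^{2}\Phi}{\partial x^{i}\partial x^{j}}(x),
\]
independent of \(\alpha\), while the third-order terms on the diagonal yield a pair of connections, exchanged under \(\alpha\mapsto-\alpha\), that are dual with respect to \(g\).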
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 53C20 / rank
 
Normal rank
Property / Mathematics Subject Classification ID
 
Property / Mathematics Subject Classification ID: 62B10 / rank
 
Normal rank
Property / zbMATH DE Number
 
Property / zbMATH DE Number: 6578429 / rank
 
Normal rank
Property / zbMATH Keywords
 
Fisher information
Property / zbMATH Keywords: Fisher information / rank
 
Normal rank
Property / zbMATH Keywords
 
alpha-connection
Property / zbMATH Keywords: alpha-connection / rank
 
Normal rank
Property / zbMATH Keywords
 
infinite-dimensional manifold
Property / zbMATH Keywords: infinite-dimensional manifold / rank
 
Normal rank
Property / zbMATH Keywords
 
convex function
Property / zbMATH Keywords: convex function / rank
 
Normal rank


Language: English
Label: Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds
Description: scientific article

    Statements

    Nonparametric information geometry: from divergence function to referential-representational biduality on statistical manifolds (English)
    10 May 2016

    Identifiers