Geometry-aware principal component analysis for symmetric positive definite matrices
DOI: 10.1007/s10994-016-5605-5 · zbMATH Open: 1459.62236 · OpenAlex: W2554291319 · MaRDI QID: Q2398090 · FDO: Q2398090
Inbal Horev, Masashi Sugiyama, Florian Yger
Publication date: 15 August 2017
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-016-5605-5
Recommendations
- Geometric Means in a Novel Vector Space Structure on Symmetric Positive‐Definite Matrices
- A Differential Geometric Approach to the Geometric Mean of Symmetric Positive-Definite Matrices
- Geometrical inverse preconditioning for symmetric positive definite matrices
- A geometric perspective on the singular value decomposition
- Geometric component analysis and its applications to data analysis
- Riemannian Geometry of Symmetric Positive Definite Matrices via Cholesky Decomposition
- scientific article; zbMATH DE number 6129459
- Tensor symmetrization and its applications in generalized principal component analysis
- Complex principal component analysis: theory and geometrical aspects
- The geometry of matrix eigenvalue methods
Keywords: dimensionality reduction; Grassmann manifold; PCA; Riemannian geometry; symmetric positive definite (SPD) manifold
MSC: Factor analysis and principal components; correspondence analysis (62H25) · Image analysis in multivariate analysis (62H35) · Statistics on manifolds (62R30)
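The keywords above point to Riemannian geometry on the manifold of SPD matrices. As an illustrative sketch only (not the algorithm of the indexed paper), the standard affine-invariant Riemannian metric gives the geodesic distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F between SPD matrices A and B, computable from a symmetric eigendecomposition:

```python
import numpy as np

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B.

    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F, evaluated via the
    eigenvalues of A^{-1/2} B A^{-1/2}, which are positive for SPD inputs.
    """
    # A^{-1/2} via the symmetric eigendecomposition of A
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    # Eigenvalues of the congruence-transformed matrix
    lam = np.linalg.eigvalsh(A_inv_sqrt @ B @ A_inv_sqrt)
    # Frobenius norm of the matrix logarithm = sqrt(sum of log(lambda_i)^2)
    return np.sqrt(np.sum(np.log(lam) ** 2))
```

This distance is invariant under congruence A ↦ G A Gᵀ, B ↦ G B Gᵀ for any invertible G, which is the property that makes it natural for covariance-matrix data.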
Cites Work
- Manopt, a Matlab toolbox for optimization on manifolds
- Principal component analysis.
- Title not available
- The Geometry of Algorithms with Orthogonality Constraints
- Title not available
- Title not available
- Title not available
- Improved Estimation of Eigenvalues and Eigenvectors of Covariance Matrices Using Their Sample Estimates
- Geometric Means in a Novel Vector Space Structure on Symmetric Positive‐Definite Matrices
- Information and complexity in statistical modeling.
- Title not available
- The implicit function theorem. History, theory, and applications
- Asymptotic Theory for Principal Component Analysis
- Positive definite matrices
- A Riemannian framework for tensor computing
- Title not available
- Positive definite matrices and the S-divergence
- Log-determinant divergences revisited: alpha-beta and gamma log-det divergences
- From Manifold to Manifold: Geometry-Aware Dimensionality Reduction for SPD Matrices
- Stochastic Gradient Descent on Riemannian Manifolds
- Non-Euclidean statistics for covariance matrices, with applications to diffusion tensor imaging
- The Riemannian Mean of Positive Matrices
- Computing the Fréchet Derivative of the Matrix Exponential, with an Application to Condition Number Estimation
Cited In (9)
- Probabilistic learning vector quantization on manifold of symmetric positive definite matrices
- cCorrGAN: conditional correlation GAN for learning empirical conditional distributions in the elliptope
- Goal-Oriented Optimal Approximations of Bayesian Linear Inverse Problems
- Minimum cost‐compression risk in principal component analysis
- Sion’s Minimax Theorem in Geodesic Metric Spaces and a Riemannian Extragradient Algorithm
- Complex principal component analysis: theory and geometrical aspects
- Riemannian Hamiltonian Methods for Min-Max Optimization on Manifolds
- Unsupervised manifold learning with polynomial mapping on symmetric positive definite matrices
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization