Estimating common principal components in high dimensions
From MaRDI portal
Abstract: We consider the problem of minimizing an objective function that depends on an orthonormal matrix. This situation arises, for example, when estimating common principal components, where the Flury method is a popular approach; however, that method is not effective for higher-dimensional problems. We derive several simple majorization-minimization (MM) algorithms that solve this problem and remain effective in higher dimensions. We then use simulated data to compare them with other approaches in terms of convergence and computational time.
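To illustrate the underlying problem, the sketch below estimates a common orthonormal eigenvector matrix for several covariance matrices. It is *not* the paper's MM algorithm: it uses the classical Jacobi-rotation (Cardoso–Souloumiac) least-squares joint diagonalizer, a relative of the Flury–Gautschi approach mentioned in the abstract. The function name and all parameters are illustrative.

```python
import numpy as np

def joint_diagonalize(mats, sweeps=30, tol=1e-12):
    """Approximately jointly diagonalize symmetric matrices by Jacobi
    rotations (least-squares criterion, Cardoso-Souloumiac angles).

    Returns an orthogonal B such that B.T @ S @ B is nearly diagonal
    for every S in `mats` (exact if the matrices share eigenvectors).
    NOTE: illustrative sketch, not the MM method proposed in the paper.
    """
    p = mats[0].shape[0]
    A = [np.array(M, dtype=float, copy=True) for M in mats]
    B = np.eye(p)
    for _ in range(sweeps):
        changed = False
        for i in range(p - 1):
            for j in range(i + 1, p):
                # For the (i, j) plane, the off-diagonal mass left after a
                # rotation by theta depends only on u = (cos 2t, sin 2t);
                # it is minimized by the top eigenvector of G below.
                h = np.array([[M[i, i] - M[j, j], 2.0 * M[i, j]] for M in A])
                G = h.T @ h
                w, V = np.linalg.eigh(G)       # ascending eigenvalues
                x, y = V[:, np.argmax(w)]      # principal eigenvector
                if x < 0.0:                    # pick the small-angle solution
                    x, y = -x, -y
                r = np.hypot(x, y)
                if r < 1e-15:                  # pair already diagonal
                    continue
                c = np.sqrt((x + r) / (2.0 * r))   # cos(theta)
                s = y / (2.0 * r * c)              # sin(theta)
                if abs(s) > tol:
                    changed = True
                    J = np.eye(p)
                    J[i, i] = J[j, j] = c
                    J[i, j], J[j, i] = -s, s
                    A = [J.T @ M @ J for M in A]
                    B = B @ J
        if not changed:                        # converged: no rotation helped
            break
    return B
```

Each Jacobi sweep only moves mass between the (i, j) entries of all matrices simultaneously, so the total off-diagonal sum of squares never increases; like the Flury method, this pairwise scheme becomes expensive as the dimension grows, which is the regime the paper's MM algorithms target.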
Recommendations
- Efficient R-estimation of principal and common principal components
- On the number of principal components in high dimensions
- Principal component analysis in very high-dimensional spaces
- scientific article; zbMATH DE number 41813
- Stepwise estimation of common principal components
- Asymptotic theory for common principal component analysis
- Common principal components for dependent random vectors
- On consistency and sparsity for principal components analysis in high dimensions
Cites work
- scientific article; zbMATH DE number 5717607
- scientific article; zbMATH DE number 3567782
- scientific article; zbMATH DE number 5223994
- An Algorithm for Simultaneous Orthogonal Transformation of Several Positive Definite Symmetric Matrices to Nearly Diagonal Form
- An efficient \(k\) nearest neighbor search for multivariate time series
- Between-group analysis with heterogeneous covariance matrices: The common principal component model
- Detecting Features in Spatial Point Processes with Clutter via Model-Based Clustering
- Discussion of 'Model-based clustering and classification with non-normal mixture distributions' by Lee and McLachlan
- Estimating correlation matrices that have common eigenvectors
- Estimating the dimension of a model
- High-dimensional data clustering
- MM algorithms for generalized Bradley-Terry models
- Model-Based Gaussian and Non-Gaussian Clustering
- Model-based cluster and discriminant analysis with the MIXMOD software
- Model-based clustering, classification, and discriminant analysis via mixtures of multivariate \(t\)-distributions
- Orthogonal Stiefel manifold optimization for eigen-decomposed covariance parameter estimation in mixture models
- Principal component models for correlation matrices
- Regularization of the location model in discrimination with mixed discrete and continuous variables
- Setting up alternating least squares and iterative majorization algorithms for solving various matrix optimization problems
- Spectral models for covariance matrices
Cited in (37 documents)
- Location and scale mixtures of Gaussians with flexible tail behaviour: properties, inference and application to multivariate clustering
- Mixture model averaging for clustering
- Convergence and prediction of principal component scores in high-dimensional settings
- An evolutionary algorithm with crossover and mutation for model-based clustering
- On parsimonious models for modeling matrix data
- Mixture model selection via hierarchical BIC
- Parsimony and parameter estimation for mixtures of multivariate leptokurtic-normal distributions
- Parsimonious mixtures for the analysis of tensor-variate data
- Parsimonious seemingly unrelated contaminated normal cluster-weighted models
- Skew multiple scaled mixtures of normal distributions with flexible tail behavior and their application to clustering
- A variational approximations-DIC rubric for parameter estimation and mixture model selection within a family setting
- Hypothesis Testing for Mixture Model Selection
- Model-based clustering with nested Gaussian clusters
- Model-based clustering
- Factor and hybrid components for model-based clustering
- Improving model choice in classification: an approach based on clustering of covariance matrices
- Constrained parsimonious model-based clustering
- On the use of the matrix-variate tail-inflated normal distribution for parsimonious mixture modeling
- A mixture of generalized hyperbolic factor analyzers
- Model-based clustering via new parsimonious mixtures of heavy-tailed distributions
- Semiparametric partial common principal component analysis for covariance matrices
- mixture
- Multivariate response and parsimony for Gaussian cluster-weighted models
- Flexible clustering of high-dimensional data via mixtures of joint generalized hyperbolic distributions
- A family of parsimonious mixtures of multivariate Poisson-lognormal distributions for clustering multivariate count data
- Logistic biplot for nominal data
- Model based clustering of high-dimensional binary data
- Preliminary Multiple-Test Estimation, With Applications to k-Sample Covariance Estimation
- A robust approach to model-based classification based on trimming and constraints. Semi-supervised learning in presence of outliers and label noise
- On the number of principal components in high dimensions
- In the pursuit of sparseness: a new rank-preserving penalty for a finite mixture of factor analyzers
- Chimeral clustering
- Multivariate cluster-weighted models based on seemingly unrelated linear regression
- A mixture of coalesced generalized hyperbolic distributions
- Detecting approximate replicate components of a high-dimensional random vector with latent structure
- A dual subspace parsimonious mixture of matrix normal distributions
- Parsimonious hidden Markov models for matrix-variate longitudinal data
MaRDI item Q95891