Online Principal Component Analysis in High Dimension: Which Algorithm to Choose?
Publication: 6086546
DOI: 10.1111/insr.12220
arXiv: 1511.03688
OpenAlex: W2964163643
MaRDI QID: Q6086546
Publication date: 10 November 2023
Published in: International Statistical Review
Full work available at URL: https://arxiv.org/abs/1511.03688
Keywords: perturbation methods; stochastic gradient; eigenvalue decomposition; incremental SVD; generalised Hebbian algorithm
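The keywords above name several families of online-PCA updates compared in the paper (perturbation methods, stochastic gradient / generalised Hebbian algorithm, incremental SVD). As a purely illustrative sketch, not drawn from the paper itself, the NumPy snippet below implements one of them, Sanger's generalised Hebbian rule for tracking the top-k principal directions of a centred data stream; the dimensions, initialisation, and 1/t step-size schedule are assumptions made for the example.

```python
import numpy as np

def gha_update(W, x, lr):
    """One step of Sanger's generalised Hebbian algorithm (GHA).

    W  : (k, d) array, current estimates of the top-k principal directions (rows).
    x  : (d,) array, one centred observation from the data stream.
    lr : step size for this iteration.
    """
    y = W @ x                                       # projections of x onto current directions
    # Hebbian term minus lower-triangular deflation term (Sanger's rule)
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Illustrative use on synthetic streaming data.
rng = np.random.default_rng(0)
d, k = 50, 3                                        # ambient dimension, number of components
A = rng.standard_normal((d, d))                     # mixing matrix giving correlated coordinates
W = 0.1 * rng.standard_normal((k, d))               # random initialisation
for t in range(1, 5001):
    x = A @ rng.standard_normal(d)                  # new sample; zero mean by construction
    W = gha_update(W, x, lr=1.0 / t)                # Robbins-Monro style decreasing step size
```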
Related Items
- Online sparse sliced inverse regression for high-dimensional streaming data
- Convergence analysis of Oja's iteration for solving online PCA with nonzero-mean samples
- Chunk-wise regularised PCA-based imputation of missing data
- Incremental modelling for compositional data streams
- Quantifying deviations from separability in space-time functional processes
- Minimum cost-compression risk in principal component analysis
- Stochastic Gauss-Newton algorithms for online PCA
- Online inference in high-dimensional generalized linear models with streaming data
- Statistical Analysis of Random Objects Via Metric Measure Laplacians
- Sparse online principal component analysis for parameter estimation in factor model
- Widening the scope of an eigenvector stochastic approximation process and application to streaming PCA and related methods
- Unsupervised streaming anomaly detection for instrumented infrastructure
- Recursive principal component analysis for model order reduction with application in nonlinear Bayesian filtering
- PLS for Big Data: a unified parallel algorithm for regularised group PLS
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Low-dimensional tracking of association structures in categorical data
- Low-rank incremental methods for computing dominant singular subspaces
- Adaptive algorithms for first principal eigenvector computation
- Global convergence of Oja's PCA learning algorithm with a non-zero-approaching adaptive learning rate
- On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix
- Rank-one modification of the symmetric eigenproblem
- Multiple imputation in principal component analysis
- Principal component analysis
- Functional data analysis
- Almost sure convergence of stochastic gradient processes with matrix step sizes
- Method of stochastic approximation in the determination of the largest eigenvalue of the mathematical expectation of random matrices
- Sequential Karhunen-Loeve basis extraction and its application to images
- Multivariate data analysis: The French way
- Acceleration of Stochastic Approximation by Averaging
- A Stable and Efficient Algorithm for the Rank-One Modification of the Symmetric Eigenproblem
- ARPACK Users' Guide
- On Updating Problems in Latent Semantic Indexing
- Online Principal Components Analysis
- Some Modified Matrix Eigenvalue Problems