Exploring dimension learning via a penalized probabilistic principal component analysis
Publication: 5887975
DOI: 10.1080/00949655.2022.2100890
OpenAlex: W4289518713
MaRDI QID: Q5887975
FDO: Q5887975
Authors: Wei-Qi Deng, Radu V. Craiu
Publication date: 21 April 2023
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://arxiv.org/abs/1803.07548
Keywords: model selection, principal component analysis, profile likelihood, penalization, dimension estimation, probabilistic principal component analysis
Cites Work
- Sparse estimation of a covariance matrix
- Principal component analysis
- Determining the Number of Factors in Approximate Factor Models
- Automatic dimensionality selection from the scree plot via the use of profile likelihood
- On the distribution of the largest eigenvalue in principal components analysis
- Shrinkage tuning parameter selection with a diverging number of parameters
- Bayes Factors
- The optimal hard threshold for singular values is \(4/\sqrt{3}\)
- Tests of significance for the latent roots of covariance and correlation matrices
- Selecting the number of principal components: estimation of the true rank of a noisy matrix
- Probabilistic principal component analysis
- Some hypothesis tests for the covariance matrix when the dimension is large compared to the sample size
- A high-dimensional test for the equality of the smallest eigenvalues of a covariance matrix
- Optimal detection of sparse principal components in high dimension
- Model averaging and dimension selection for the singular value decomposition
- Selecting the number of components in principal component analysis using cross-validation approximations
- Practical approaches to principal component analysis in the presence of missing values
- Automatic PCA dimension selection for high dimensional data and small sample sizes
- Inferring the eigenvalues of covariance matrices from limited, noisy data
- On estimation of the noise variance in high dimensional probabilistic principal component analysis
- Exact dimensionality selection for Bayesian PCA
- Likelihood ratio test for partial sphericity in high and ultra-high dimensions
- Estimation of the number of spikes, possibly equal, in the high-dimensional case
Cited In (2)
Uses Software