Principal component selection via adaptive regularization method and generalized information criterion
DOI: 10.1007/s00362-015-0691-1
zbMATH Open: 1394.62079
OpenAlex: W2199019030
MaRDI QID: Q513693
Authors: Heewon Park, Sadanori Konishi
Publication date: 7 March 2017
Published in: Statistical Papers
Full work available at URL: https://doi.org/10.1007/s00362-015-0691-1
Recommendations
- Detecting the dimensionality for principal components model
- Automatic sparse principal component analysis
- Bayesian principal component regression with data-driven component selection
- Principal component analysis in very high-dimensional spaces
- Sparse principal component regression with adaptive loading
Keywords: principal component analysis; information criterion; adaptive \(L_1\)-type penalty; sparse regression modeling
MSC classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Generalized linear models (logistic models) (62J12)
Cites Work
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Estimating the dimension of a model
- The Adaptive Lasso and Its Oracle Properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Information criteria and statistical modeling.
- Variable selection for generalized varying coefficient models with longitudinal data
- Variable selection via the weighted group Lasso for factor analysis models
- Generalised information criteria in model selection
- Variable selection in high-dimensional double generalized linear models
- Combining two-parameter and principal component regression estimators
- Selecting the Number of Principal Components in Functional Data
- Lag weighted lasso for time series model
- Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator
- Robust estimation for spatial semiparametric varying coefficient partially linear regression
Cited In (4)
- Outlier-resistant high-dimensional regression modelling based on distribution-free outlier detection and tuning parameter selection
- Convergence rate of eigenvector empirical spectral distribution of large Wigner matrices
- A generalized information criterion for high-dimensional PCA rank selection
- L1-norm-based principal component analysis with adaptive regularization