Penalized orthogonal-components regression for large \(p\) small \(n\) data
Publication: 1952001
DOI: 10.1214/09-EJS354
zbMath: 1326.62149
arXiv: 0811.4167
OpenAlex: W2138769430
MaRDI QID: Q1952001
Yanzhu Lin, Dabao Zhang, Min Zhang
Publication date: 27 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/0811.4167
Keywords: POCRE; supervised dimension reduction; \(p \gg n\) data; empirical Bayes thresholding; latent-variable model; sparse predictors
MSC classifications:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Measures of association (correlation, canonical correlation, etc.) (62H20)
- General nonlinear regression (62J02)
Related Items (3)
- Generalized orthogonal components regression for high dimensional generalized linear models
- Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions
- Bayesian variable selection with shrinking and diffusing priors
Cites Work
- Fisher lecture: Dimension reduction in regression
- Sparse principal component analysis via regularized low rank matrix approximation
- Heuristics of instability and stabilization in model selection
- Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences
- A Sparse PLS for Variable Selection when Integrating Omics Data
- The Bayesian Lasso
- An Interpretation of Partial Least Squares
- Ideal spatial adaptation by wavelet shrinkage
- Sparsity and Smoothness Via the Fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Large-Scale Simultaneous Hypothesis Testing
- Prediction by Supervised Principal Components