High dimensional ordinary least squares projection for screening variables
Publication:5378148
Abstract: Variable selection is a challenging issue in statistical applications when the number of predictors far exceeds the number of observations. In this ultra-high dimensional setting, the sure independence screening (SIS) procedure was introduced to significantly reduce the dimensionality by preserving the true model with overwhelming probability before a refined second-stage analysis. However, the aforementioned sure screening property relies strongly on the assumption that the important variables in the model have large marginal correlations with the response, which rarely holds in reality. To overcome this, we propose a novel and simple screening technique called high-dimensional ordinary least-squares projection (HOLP). We show that HOLP possesses the sure screening property, gives consistent variable selection without the strong correlation assumption, and has low computational complexity. A ridge-type HOLP procedure is also discussed. A simulation study shows that HOLP performs competitively compared with many other marginal-correlation-based methods. An application to mammalian eye disease data illustrates the attractiveness of HOLP.
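For readers interested in the mechanics, HOLP estimates the coefficients by the minimum-norm least-squares fit beta_hat = X'(XX')^{-1} y (or X'(XX' + rI)^{-1} y for the ridge-type variant) and retains the variables with the largest entries |beta_hat_j|. The following is a minimal NumPy sketch of that screening step; the function name holp_screen, the toy data, and the submodel size d are illustrative assumptions, not the authors' software.

```python
import numpy as np

def holp_screen(X, y, d, ridge=0.0):
    """Rank predictors by the HOLP estimate and keep the d largest.

    Solves the n x n system (X X' + r I) z = y and sets beta_hat = X' z;
    r = 0 gives plain HOLP, r > 0 the ridge-type variant.
    """
    n, p = X.shape
    gram = X @ X.T + ridge * np.eye(n)        # n x n system: cheap when p >> n
    beta_hat = X.T @ np.linalg.solve(gram, y)
    keep = np.argsort(np.abs(beta_hat))[::-1][:d]   # indices of the d largest |beta_hat_j|
    return keep, beta_hat

# Toy usage: p = 1000 predictors, n = 100 observations, two active variables.
rng = np.random.default_rng(0)
n, p = 100, 1000
X = rng.standard_normal((n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 5] + rng.standard_normal(n)
selected, _ = holp_screen(X, y, d=n)          # keep a submodel of size n
```

Because only an n x n linear system is solved, the cost grows linearly in p, which is the low computational complexity mentioned in the abstract.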
Recommendations
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Ultra-high dimensional variable screening via Gram-Schmidt orthogonalization
- Adaptive model-free sure independence screening
- Dynamic tilted current correlation for high dimensional variable screening
- Ultrahigh dimensional feature screening via projection
Cited in (36)
- Interaction screening via canonical correlation
- Two‐stage penalized regression screening to detect biomarker–treatment interactions in randomized clinical trials
- Covariate Information Number for Feature Screening in Ultrahigh-Dimensional Supervised Problems
- Robust group variable screening based on maximum Lq-likelihood estimation
- Are Latent Factor Regression and Sparse Regression Adequate?
- Cluster feature selection in high-dimensional linear models
- Learning sparse conditional distribution: an efficient kernel-based approach
- Nonparametric augmented probability weighting with sparsity
- Structure learning via unstructured kernel-based M-estimation
- Scalable inference for high-dimensional precision matrix
- Partition-based feature screening for categorical data via RKHS embeddings
- Conditional characteristic feature screening for massive imbalanced data
- Threshold Selection in Feature Screening for Error Rate Control
- A selective overview of feature screening methods with applications to neuroimaging data
- Prior Knowledge Guided Ultra-High Dimensional Variable Screening With Application to Neuroimaging Data
- A fast adaptive Lasso for the Cox regression via safe screening rules
- Covariance-insured screening
- Ultra-high dimensional variable screening via Gram-Schmidt orthogonalization
- Risk spillover network structure learning for correlated financial assets: a directed acyclic graph approach
- Efficient kernel-based variable selection with sparsistency
- Uniform joint screening for ultra-high dimensional graphical models
- Screen then select: a strategy for correlated predictors in high-dimensional quantile regression
- Variable selection for categorical response: a comparative study
- Network-based feature screening with applications to genome data
- Factor-Adjusted Regularized Model Selection
- Dynamic tilted current correlation for high dimensional variable screening
- A data-driven approach to conditional screening of high-dimensional variables
- High-dimensional variable screening under multicollinearity
- RaSE: A Variable Screening Framework via Random Subspace Ensembles
- Scalable and efficient inference via CPE
- Support recovery of Gaussian graphical model with false discovery rate control
- Likelihood ratio test in multivariate linear regression: from low to high dimension
- Subgroup analysis method for accelerated failure time model
- Grouped variable screening for ultra-high dimensional data for linear model
- Cross-Trait Prediction Accuracy of Summary Statistics in Genome-Wide Association Studies
- On the dimension effect of regularized linear discriminant analysis