Sequential Lasso cum EBIC for feature selection with ultra-high dimensional feature space
DOI: 10.1080/01621459.2013.877275 · zbMATH Open: 1368.62205 · OpenAlex: W2081474178 · MaRDI QID: Q4975573 · FDO: Q4975573
Authors: Shan Luo, Zehua Chen
Publication date: 7 August 2017
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2013.877275
Recommendations
- Sequential feature screening for generalized linear models with sparse ultra-high dimensional data
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Ultrahigh dimensional feature selection: beyond the linear model
- Adaptive Lasso for sparse high-dimensional regression models
MSC Classification
- Linear regression; mixed models (62J05)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Sequential estimation (62L12)
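The title names the method this record describes: select features by a sequence of partially penalized Lasso fits, using the extended BIC (EBIC) of Chen and Chen as the stopping rule. As a rough, hedged illustration of the EBIC stopping rule only, the sketch below scores greedy forward least-squares steps with EBIC; plain forward selection stands in for the paper's sequential Lasso updates, and the function names, the default γ = 1, and the centered-data/no-intercept assumption are illustrative choices, not the authors' implementation.

```python
import numpy as np
from math import log, lgamma

def ebic(rss, n, k, p, gamma=1.0):
    """Extended BIC (Chen & Chen, 2008):
    n*log(RSS/n) + k*log(n) + 2*gamma*log(binom(p, k))."""
    log_binom = lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)
    return n * log(rss / n) + k * log(n) + 2 * gamma * log_binom

def sequential_select(X, y, gamma=1.0, max_steps=10):
    """Greedy forward selection scored by EBIC (an illustrative
    stand-in for the paper's sequential Lasso steps).
    Assumes X and y are centered; no intercept is fitted."""
    n, p = X.shape
    active = []
    best = ebic((y ** 2).sum(), n, 0, p, gamma)  # null-model RSS
    for _ in range(max_steps):
        candidates = []
        for j in range(p):
            if j in active:
                continue
            cols = active + [j]
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = ((y - X[:, cols] @ beta) ** 2).sum()
            candidates.append((ebic(rss, n, len(cols), p, gamma), j))
        if not candidates:
            break
        score, j = min(candidates)
        if score >= best:  # stop as soon as EBIC stops decreasing
            break
        best, active = score, active + [j]
    return sorted(active)
```

The combinatorial term 2γ log C(p, k) is what distinguishes EBIC from ordinary BIC: it penalizes model size relative to the number of candidate models of that size, which keeps the rule consistent when p grows much faster than n.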
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Extended Bayesian information criteria for model selection with large model spaces
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- High-dimensional graphs and variable selection with the Lasso
- SCAD-penalized regression in high-dimensional partially linear models
- Asymptotics for Lasso-type estimators.
- Title not available
- New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Nonconcave penalized likelihood with a diverging number of parameters.
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Mathematical Statistics
- Smoothly clipped absolute deviation on high dimensions
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Forward regression for ultra-high dimensional variable screening
- Improved variable selection with forward-lasso adaptive shrinkage
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Adaptive Forward-Backward Greedy Algorithm for Learning Sparse Representations
Cited In (33)
- Forward variable selection for sparse ultra-high-dimensional generalized varying coefficient models
- Forward variable selection for ultra-high dimensional quantile regression models
- A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification
- Preselection in Lasso-type analysis for ultra-high dimensional genomic exploration
- Hard thresholding regularised logistic regression: theory and algorithms
- A sequential feature selection approach to change point detection in mean-shift change point models
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- Sequential profile Lasso for ultra-high-dimensional partially linear models
- A sequential approach to feature selection in high-dimensional additive models
- Forward selection for feature screening and structure identification in varying coefficient models
- Forward regression for Cox models with high-dimensional covariates
- A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
- The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Which bridge estimator is the best for variable selection?
- Rank-based sequential feature selection for high-dimensional accelerated failure time models with main and interaction effects
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Gini correlation for feature screening
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- Edge detection in sparse Gaussian graphical models
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Quantile forward regression for high-dimensional survival data
- Tournament screening cum EBIC for feature selection with high-dimensional feature spaces
- GEE-Assisted Forward Regression for Spatial Latent Variable Models
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
- Variable selection in high-dimensional sparse multiresponse linear regression models
- Sequential interaction group selection by the principle of correlation search for high-dimensional interaction models
- A sequential feature selection procedure for high-dimensional Cox proportional hazards model
- Feature selection by canonical correlation search in high-dimensional multiresponse models with complex group structures
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Sequential feature screening for generalized linear models with sparse ultra-high dimensional data
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee