Sequential Lasso cum EBIC for feature selection with ultra-high dimensional feature space
From MaRDI portal
Publication:4975573
Recommendations
- Sequential feature screening for generalized linear models with sparse ultra-high dimensional data
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Sure independence screening for ultrahigh dimensional feature space (with discussion and authors' reply)
- Ultrahigh dimensional feature selection: beyond the linear model
- Adaptive Lasso for sparse high-dimensional regression models
Cites work
- Scientific article (zbMATH DE number 3841083; no title available)
- Adaptive Forward-Backward Greedy Algorithm for Learning Sparse Representations
- Asymptotics for Lasso-type estimators
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Extended Bayesian information criteria for model selection with large model spaces
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Forward regression for ultra-high dimensional variable screening
- High-dimensional graphs and variable selection with the Lasso
- Improved variable selection with forward-lasso adaptive shrinkage
- Least angle regression (with discussion)
- Mathematical Statistics
- Nearly unbiased variable selection under minimax concave penalty
- New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis
- Nonconcave penalized likelihood with a diverging number of parameters
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- SCAD-penalized regression in high-dimensional partially linear models
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Smoothly clipped absolute deviation on high dimensions
- The Adaptive Lasso and Its Oracle Properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (33)
- Forward variable selection for sparse ultra-high-dimensional generalized varying coefficient models
- Forward variable selection for ultra-high dimensional quantile regression models
- A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification
- Preselection in Lasso-type analysis for ultra-high dimensional genomic exploration
- Hard thresholding regularised logistic regression: theory and algorithms
- A sequential feature selection approach to change point detection in mean-shift change point models
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- Sequential profile Lasso for ultra-high-dimensional partially linear models
- A sequential approach to feature selection in high-dimensional additive models
- Forward regression for Cox models with high-dimensional covariates
- Forward selection for feature screening and structure identification in varying coefficient models
- A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
- The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
- Which bridge estimator is the best for variable selection?
- Rank-based sequential feature selection for high-dimensional accelerated failure time models with main and interaction effects
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Gini correlation for feature screening
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- Edge detection in sparse Gaussian graphical models
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Tournament screening cum EBIC for feature selection with high-dimensional feature spaces
- Quantile forward regression for high-dimensional survival data
- GEE-Assisted Forward Regression for Spatial Latent Variable Models
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- Variable selection in high-dimensional sparse multiresponse linear regression models
- A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
- Sequential interaction group selection by the principle of correlation search for high-dimensional interaction models
- A sequential feature selection procedure for high-dimensional Cox proportional hazards model
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- Sequential feature screening for generalized linear models with sparse ultra-high dimensional data
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- Feature selection by canonical correlation search in high-dimensional multiresponse models with complex group structures