Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
From MaRDI portal
Publication: 4975573
DOI: 10.1080/01621459.2013.877275
zbMath: 1368.62205
OpenAlex: W2081474178
MaRDI QID: Q4975573
Publication date: 7 August 2017
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2013.877275
Classification (MSC):
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62J05 Linear regression; mixed models
- 62L12 Sequential estimation
Related Items
- Sequential profile Lasso for ultra-high-dimensional partially linear models
- Edge detection in sparse Gaussian graphical models
- Forward variable selection for sparse ultra-high-dimensional generalized varying coefficient models
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- GEE-Assisted Forward Regression for Spatial Latent Variable Models
- Hard Thresholding Regularised Logistic Regression: Theory and Algorithms
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Graph-based sparse linear discriminant analysis for high-dimensional classification
- Forward variable selection for ultra-high dimensional quantile regression models
- Quantile forward regression for high-dimensional survival data
- Forward selection for feature screening and structure identification in varying coefficient models
- A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification
- A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- Sequential feature screening for generalized linear models with sparse ultra-high dimensional data
- A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
- Feature Selection by Canonical Correlation Search in High-Dimensional Multiresponse Models With Complex Group Structures
- Which bridge estimator is the best for variable selection?
- A sequential approach to feature selection in high-dimensional additive models
- Forward regression for Cox models with high-dimensional covariates
- The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data
- Gini correlation for feature screening
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- Variable selection in high-dimensional sparse multiresponse linear regression models
- A sequential feature selection procedure for high-dimensional Cox proportional hazards model
- A sequential scaled pairwise selection approach to edge detection in nonparanormal graphical models
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- FIRST: combining forward iterative selection and shrinkage in high dimensional sparse linear regression
- Improved variable selection with forward-lasso adaptive shrinkage
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- SCAD-penalized regression in high-dimensional partially linear models
- Asymptotics for Lasso-type estimators
- Nonconcave penalized likelihood with a diverging number of parameters
- Least angle regression (with discussion)
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- High-dimensional graphs and variable selection with the Lasso
- Forward Regression for Ultra-High Dimensional Variable Screening
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Extended Bayesian information criteria for model selection with large model spaces
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Mathematical Statistics
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Adaptive Forward-Backward Greedy Algorithm for Learning Sparse Representations
- Smoothly Clipped Absolute Deviation on High Dimensions
- New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis