Consistent tuning parameter selection in high dimensional sparse linear regression
From MaRDI portal
Publication: 548648
DOI: 10.1016/j.jmva.2011.03.007
zbMath: 1216.62103
OpenAlex: W2078536949
MaRDI QID: Q548648
Publication date: 29 June 2011
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2011.03.007
Keywords: Bayesian information criterion; high dimensionality; variable selection; sure independence screening; adaptive elastic net
MSC classifications: Estimation in multivariate analysis (62H12); Asymptotic properties of nonparametric inference (62G20); Linear regression; mixed models (62J05); Bayesian inference (62F15)
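The article's subject, selecting the penalty level in sparse linear regression by a BIC-type criterion, can be illustrated with a minimal sketch. This uses the classical BIC and a plain coordinate-descent lasso, not the paper's modified high-dimensional criterion or the adaptive elastic net; the function names and the simulated data are illustrative only.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the building block of coordinate-descent lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1 (no intercept)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r_j = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r_j, n * lam) / col_sq[j]
    return beta

def bic_select(X, y, lams):
    """Pick lambda minimizing BIC = n*log(RSS/n) + df*log(n), df = nonzero count."""
    n = X.shape[0]
    best = None
    for lam in lams:
        beta = lasso_cd(X, y, lam)
        rss = ((y - X @ beta) ** 2).sum()
        df = (beta != 0).sum()
        bic = n * np.log(rss / n) + df * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, lam, beta)
    return best

# Simulated sparse design: only the first two coefficients are truly nonzero.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[1] = 3.0, -2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

bic_hat, lam_hat, beta_hat = bic_select(X, y, np.logspace(-2, 0, 15))
print("selected lambda:", lam_hat, "support:", np.flatnonzero(beta_hat))
```

The key point the paper studies is that the log(n) penalty on model size makes this kind of criterion select the true support consistently as dimension grows, whereas prediction-oriented criteria such as AIC or cross-validation tend to overselect.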
Related Items
- A robust and efficient variable selection method for linear regression
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
- Regularized latent class analysis with application in cognitive diagnosis
- Globally adaptive quantile regression with ultra-high dimensional data
- A Unified Framework for Change Point Detection in High-Dimensional Linear Models
- Variables selection using \(\mathcal{L}_0\) penalty
- A modified information criterion for tuning parameter selection in 1d fused LASSO for inference on multiple change points
- Cross-Validation With Confidence
- Smooth predictive model fitting in regression
- Model selection in sparse high-dimensional vine copula models with an application to portfolio risk
- Sparse group fused Lasso for model segmentation: a hybrid approach
- Tuning parameter selection for penalised empirical likelihood with a diverging number of parameters
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Estimating the dimension of a model
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- The risk inflation criterion for multiple regression
- On the adaptive elastic net with a diverging number of parameters
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Forward Regression for Ultra-High Dimensional Variable Screening
- Model selection in irregular problems: Applications to mapping quantitative trait loci
- Extended Bayesian information criteria for model selection with large model spaces
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- A Model Selection Approach for the Identification of Quantitative Trait Loci in Experimental Crosses
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution