Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
From MaRDI portal
Publication: 389956
DOI: 10.1214/14-EJS875
zbMath: 1281.62158
arXiv: 1306.5505
MaRDI QID: Q389956
Publication date: 22 January 2014
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1306.5505
Keywords: asymptotic normality; sparsity; asymptotic unbiasedness; irrepresentable condition; Lasso+Ridge; residual bootstrap
62F12: Asymptotic properties of parametric estimators
62J07: Ridge regression; shrinkage estimators (Lasso)
62F40: Bootstrap, jackknife and other resampling methods
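The two-stage estimators named in the title and keywords can be sketched as follows: the Lasso is run first to select a sparse support, and a second low-dimensional fit (modified least squares for Lasso+mLS, or a ridge regression for Lasso+Ridge) is then computed on the selected variables only, removing most of the Lasso's shrinkage bias. This is a minimal illustrative sketch using scikit-learn, not the authors' implementation; the regularization levels and simulated data are arbitrary choices for the example.

```python
# Hedged sketch of a two-stage "Lasso + Ridge" estimator:
# stage 1 selects a sparse support with the Lasso, stage 2 refits
# a ridge regression on the selected columns only. Tuning values
# (alpha=0.3, alpha=1.0) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                     # n samples, p features, s-sparse truth
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0                            # true nonzero coefficients
y = X @ beta + rng.standard_normal(n)

# Stage 1: Lasso for variable selection.
lasso = Lasso(alpha=0.3).fit(X, y)
support = np.flatnonzero(lasso.coef_)

# Stage 2: refit on the selected columns with a small ridge penalty,
# which debiases the shrunken Lasso estimates on the support.
ridge = Ridge(alpha=1.0).fit(X[:, support], y)

beta_hat = np.zeros(p)
beta_hat[support] = ridge.coef_           # zero off the selected support
```

Replacing the stage-2 ridge fit with ordinary (or modified) least squares on `X[:, support]` gives the Lasso+mLS variant studied in the paper.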
Related Items
- Projection-based Inference for High-dimensional Linear Models
- Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Comments on: "High-dimensional simultaneous inference with the bootstrap"
- Markov Neighborhood Regression for High-Dimensional Inference
- Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation
- Beyond support in two-stage variable selection
- High-dimensional simultaneous inference with the bootstrap
- Rejoinder on: "High-dimensional simultaneous inference with the bootstrap"
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Confidence intervals for parameters in high-dimensional sparse vector autoregression
- Post-model-selection inference in linear regression models: an integrated review
- Ridge regression revisited: debiasing, thresholding and bootstrap
- Change points detection and parameter estimation for multivariate time series
- Random weighting in LASSO regression
- Bootstrap confidence regions based on M-estimators under nonstandard conditions
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Lasso meets horseshoe: a survey
Uses Software
Cites Work
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- On the distribution of the adaptive LASSO estimator
- Some asymptotic theory for the bootstrap
- Bootstrapping regression models
- Asymptotic expansions for the power of distribution-free tests in the two-sample problem
- Bootstrap methods: another look at the jackknife
- Contributions of empirical and quantile processes to the asymptotic theory of goodness-of-fit tests. (With comments)
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Least angle regression. (With discussion)
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- Least squares after model selection in high-dimensional sparse models
- Simultaneous analysis of Lasso and Dantzig selector
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Pathwise coordinate optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Bootstrapping Lasso Estimators
- A Perturbation Method for Inference on Regularized Regression Estimates
- Stable recovery of sparse overcomplete representations in the presence of noise
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Greed is Good: Algorithmic Results for Sparse Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Model selection consistency of Dantzig selector
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Regularization and Variable Selection Via the Elastic Net
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- MODEL SELECTION AND INFERENCE: FACTS AND FICTION
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers