A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
From MaRDI portal
Publication: Q5134479 (MaRDI QID)
DOI: 10.5705/ss.202018.0131
zbMATH: 1453.62581
arXiv: 1706.02150
OpenAlex: W2964037952
Wikidata: Q129423648 (Scholia: Q129423648)
Hanzhong Liu, Jingyi Jessica Li, Xin Xu
Publication date: 16 November 2020
Published in: Statistica Sinica
Full work available at URL: https://arxiv.org/abs/1706.02150
Keywords: bootstrap; confidence interval; high-dimensional inference; model selection consistency; Lasso+partial ridge
MSC classification:
- Parametric tolerance and confidence regions (62F25)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Bootstrap, jackknife and other resampling methods (62F40)
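The title and keywords above describe a Lasso + partial ridge (LPR) bootstrap procedure; the related items and software section below list an R implementation (HDCI). The following is a minimal Python sketch of the general idea, not the authors' implementation: Lasso selects a support, a partial-ridge refit penalizes only the unselected coefficients, and residual-bootstrap replicates yield percentile confidence intervals. The helper names, the use of LassoCV for tuning, and the default choice λ2 = 1/n are illustrative assumptions.

```python
# Minimal sketch of a bootstrap Lasso + partial ridge (LPR) confidence interval,
# under the assumptions stated in the lead-in. Not the authors' HDCI code.
import numpy as np
from sklearn.linear_model import LassoCV

def partial_ridge(X, y, support, lam2):
    """Least squares with a ridge penalty only on coefficients outside `support`."""
    n, p = X.shape
    penalty = np.full(p, lam2)
    penalty[support] = 0.0                      # selected coefficients stay unpenalized
    # Solve the penalized normal equations (X'X/n + diag(penalty)) beta = X'y/n.
    A = X.T @ X / n + np.diag(penalty)
    return np.linalg.solve(A, X.T @ y / n)

def lpr_bootstrap_ci(X, y, B=500, lam2=None, alpha=0.05, seed=0):
    """Residual-bootstrap percentile intervals for the LPR estimator (sketch)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    lam2 = 1.0 / n if lam2 is None else lam2    # assumed default ridge tuning parameter
    lasso = LassoCV(cv=5).fit(X, y)
    support = np.flatnonzero(lasso.coef_)
    beta_hat = partial_ridge(X, y, support, lam2)
    resid = y - X @ beta_hat
    resid = resid - resid.mean()                # centered residuals for the residual bootstrap
    boot = np.empty((B, p))
    for b in range(B):
        y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
        support_b = np.flatnonzero(LassoCV(cv=5).fit(X, y_star).coef_)
        boot[b] = partial_ridge(X, y_star, support_b, lam2)
    lo, hi = np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return beta_hat, lo, hi
```

Calling `lpr_bootstrap_ci(X, y)` returns the LPR point estimate together with coordinatewise lower and upper percentile bounds at the chosen level.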
Related Items
IFAA: Robust Association Identification and Inference for Absolute Abundance in Microbiome Analyses ⋮ Automatic variable selection in a linear model on massive data ⋮ Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation ⋮ Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses ⋮ Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity ⋮ HDCI
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- A lava attack on the recovery of sums of dense and sparse signals
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Statistics for high-dimensional data. Methods, theory and applications.
- Controlling the false discovery rate via knockoffs
- High-dimensional simultaneous inference with the bootstrap
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Asymptotics for Lasso-type estimators.
- Weak signal identification and inference in penalized model selection
- A significance test for the lasso
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Bootstrapping Lasso Estimators
- A Perturbation Method for Inference on Regularized Regression Estimates
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- Post selection shrinkage estimation for high‐dimensional data analysis
- Stability Selection
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
- Group Bound: Confidence Intervals for Groups of Variables in Sparse High Dimensional Regression Without Assumptions on the Design
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers