Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
DOI: 10.1214/13-AOS1106
zbMath: 1293.62153
arXiv: 1307.1952
Wikidata: Q57426703 (Scholia: Q57426703)
MaRDI QID: Q366968
Authors: Soumendra Nath Lahiri, Arindam Chatterjee
Publication date: 25 September 2013
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1307.1952
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic distribution theory in statistics (62E20)
- Nonparametric statistical resampling methods (62G09)
Related Items (34)
- Markov Neighborhood Regression for High-Dimensional Inference
- Confidence intervals for high-dimensional partially linear single-index models
- Post-model-selection inference in linear regression models: an integrated review
- Thresholding least-squares inference in high-dimensional regression models
- Projection-based Inference for High-dimensional Linear Models
- Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Monte Carlo Simulation for Lasso-Type Problems by Estimator Augmentation
- Bootstrap confidence regions based on M-estimators under nonstandard conditions
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Debiasing the debiased Lasso with bootstrap
- Testing stochastic dominance with many conditioning variables
- High-dimensional simultaneous inference with the bootstrap
- Inference for sparse linear regression based on the leave-one-covariate-out solution path
- Second order correctness of perturbation bootstrap M-estimator of multiple linear regression parameter
- Relaxing the assumptions of knockoffs by conditioning
- A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- Variable selection in the Box-Cox power transformation model
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Beyond support in two-stage variable selection
- Can we trust the bootstrap in high-dimension?
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Confidence intervals for high-dimensional inverse covariance estimation
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- INFERENCE AFTER MODEL AVERAGING IN LINEAR REGRESSION MODELS
- Perturbation bootstrap in adaptive Lasso
- Confidence intervals for parameters in high-dimensional sparse vector autoregression
- Comments on: "High-dimensional simultaneous inference with the bootstrap"
- Bootstrap inference for penalized GMM estimators with oracle properties
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Strong consistency of Lasso estimators
- Valid post-selection inference
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Approximations for multivariate U-statistics
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- On the distribution of the adaptive LASSO estimator
- Bootstrapping regression models
- On the validity of the formal Edgeworth expansion
- Bootstrap methods: another look at the jackknife
- Asymptotics for Lasso-type estimators.
- Simultaneous analysis of Lasso and Dantzig selector
- A note on the asymptotic distribution of lasso estimator for correlated data
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Asymptotic properties of the residual bootstrap for Lasso estimators
- Bootstrapping Lasso Estimators
- A Perturbation Method for Inference on Regularized Regression Estimates
- Model selection and estimation in the Gaussian graphical model
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- The bootstrap and Edgeworth expansion