Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
DOI: 10.1016/j.jeconom.2017.11.005 · zbMath: 1386.62020 · arXiv: 1410.4208 · OpenAlex: W2774241305 · Wikidata: Q109043099 · Scholia: Q109043099 · MaRDI QID: Q1706454
Mehmet Caner, Anders Bredahl Kock
Publication date: 22 March 2018
Published in: Journal of Econometrics
Full work available at URL: https://arxiv.org/abs/1410.4208
Keywords: asymptotic distribution; confidence intervals; high-dimensional data; tests; honest inference; uniform inference; conservative Lasso
MSC: Asymptotic properties of parametric estimators (62F12); Estimation in multivariate analysis (62H12); Ridge regression; shrinkage estimators (Lasso) (62J07); Asymptotic distribution theory in statistics (62E20); Nonparametric tolerance and confidence regions (62G15)
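As a rough illustration of the debiasing idea named in the title (not code from the paper), the sketch below builds desparsified-Lasso confidence intervals in a linear model: an initial Lasso fit is bias-corrected with a nodewise-Lasso approximation of the inverse Gram matrix, giving coordinates that are asymptotically normal. This is a minimal sketch under simplifying assumptions; it uses a plain Lasso where the paper uses the conservative (weighted) Lasso as the first stage, a plug-in noise estimate, and an arbitrary illustrative tuning value `lam = 0.1` for all penalties.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated sparse linear model: y = X beta + noise (illustrative sizes).
rng = np.random.default_rng(0)
n, p, s = 100, 20, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0
y = X @ beta + rng.standard_normal(n)

lam = 0.1  # hypothetical tuning parameter, not a recommended choice

# Stage 1: initial Lasso fit (the paper's conservative Lasso would reweight this).
bhat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
resid = y - X @ bhat

# Stage 2: nodewise Lasso -- row j of Theta approximates row j of the
# inverse of the sample Gram matrix Sigma_hat = X'X / n.
Theta = np.zeros((p, p))
for j in range(p):
    Xmj = np.delete(X, j, axis=1)
    gamma = Lasso(alpha=lam, fit_intercept=False).fit(Xmj, X[:, j]).coef_
    r_j = X[:, j] - Xmj @ gamma
    tau2 = r_j @ X[:, j] / n          # tau_j^2, positive by the KKT conditions
    Theta[j] = np.insert(-gamma, j, 1.0) / tau2

# Desparsified estimator: one-step bias correction of the Lasso.
b_dsp = bhat + Theta @ (X.T @ resid) / n

# Plug-in asymptotic variance and pointwise 95% confidence intervals.
sigma2 = resid @ resid / n
Sigma = X.T @ X / n
se = np.sqrt(sigma2 * np.diag(Theta @ Sigma @ Theta.T) / n)
ci_lo, ci_hi = b_dsp - 1.96 * se, b_dsp + 1.96 * se
```

The bias correction removes the shrinkage bias of the first-stage estimate coordinate by coordinate, which is what allows Gaussian intervals that are honest uniformly over sparse parameter vectors; the paper's contribution is to carry this out with the conservative Lasso first stage.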
Related Items
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Valid post-selection inference
- Statistics for high-dimensional data. Methods, theory and applications.
- On adaptive inference and confidence bands
- One-step sparse estimates in nonconcave penalized likelihood models
- Confidence sets based on sparse estimators are necessarily large
- Honest confidence regions for nonparametric regression
- On the conditions used to prove oracle results for the Lasso
- A significance test for the lasso
- Simultaneous analysis of Lasso and Dantzig selector
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Confidence sets in sparse regression
- Adaptive robust variable selection
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Statistical learning and selective inference
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Root-N-Consistent Semiparametric Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- Power Enhancement in High-Dimensional Cross-Sectional Tests
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Probability Inequalities