The benefit of group sparsity in group inference with de-biased scaled group Lasso
DOI: 10.1214/16-EJS1120 · zbMATH Open: 1397.62261 · arXiv: 1412.4170 · OpenAlex: W2963304125 · MaRDI QID: Q309547 · FDO: Q309547
Publication date: 7 September 2016
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1412.4170
Recommendations
- Group Bound: Confidence Intervals for Groups of Variables in Sparse High Dimensional Regression Without Assumptions on the Design
- The benefit of group sparsity
- Group inference in high dimensions with applications to hierarchical testing
- Debiasing the Lasso: optimal sample size for Gaussian designs
- The de-biased group Lasso estimation for varying coefficient models
Keywords: asymptotic normality; bias correction; group Lasso; group inference; chi-squared distribution; confidence region; relaxed projection
MSC classification: Nonparametric tolerance and confidence regions (62G15); Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Confidence intervals for high-dimensional inverse covariance estimation
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- p-Values for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Model Selection and Estimation in Regression with Grouped Variables
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Asymptotics for Lasso-type estimators.
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Scaled sparse linear regression
- Statistical significance in high-dimensional linear models
- Title not available
- \(\ell_{1}\)-penalization for mixture regression models
- Hanson-Wright inequality and sub-Gaussian concentration
- High-dimensional variable selection
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- A selective review of group selection in high-dimensional models
- Worst possible sub-directions in high-dimensional models
- Sparse Matrix Inversion with Scaled Lasso
- A Constrained \(\ell_1\) Minimization Approach to Sparse Precision Matrix Estimation
- Decoding by Linear Programming
- Sparse principal component analysis and iterative thresholding
- A group bridge approach for variable selection
- Can one estimate the conditional distribution of post-model-selection estimators?
- Title not available
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Oracle inequalities and optimal inference under group sparsity
- The benefit of group sparsity
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Quantile regression for competing risks data with missing cause of failure
- Comments on: \(\ell _{1}\)-penalization for mixture regression models
- The Dantzig selector and sparsity oracle inequalities
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- On the asymptotic properties of the group lasso estimator for linear models
- Rejoinder: ``A significance test for the lasso''
- Spectral norm of products of random and deterministic matrices
- Support union recovery in high-dimensional multivariate regression
- Shifting Inequality and Recovery of Sparse Signals
Cited In (21)
- Statistical inference in sparse high-dimensional additive models
- The benefit of group sparsity
- Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
- Generalized matrix decomposition regression: estimation and inference for two-way structured data
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Title not available
- Covariate-adjusted inference for differential analysis of high-dimensional networks
- Generalized linear models with structured sparsity estimators
- Constructing confidence intervals for the signals in sparse phase retrieval
- Multivariate log-contrast regression with sub-compositional predictors: testing the association between preterm infants' gut microbiome and neurobehavioral outcomes
- Multicarving for high-dimensional post-selection inference
- Linear Hypothesis Testing in Dense High-Dimensional Linear Models
- Group Inference in High Dimensions with Applications to Hierarchical Testing
- Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors
- Rejoinder on: ``High-dimensional simultaneous inference with the bootstrap''
- Title not available
- AIC for the group Lasso in generalized linear models
- Tuning-Free Heterogeneity Pursuit in Massive Networks
- Spatially relaxed inference on high-dimensional linear models
- Moving force identification based on group Lasso and compressed sensing
- The de-biased group Lasso estimation for varying coefficient models