Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
Publication: Q4975847
DOI: 10.1109/TIT.2009.2016018
zbMATH Open: 1367.62220
MaRDI QID: Q4975847
FDO: Q4975847
Author: Martin J. Wainwright
Publication date: 8 August 2017
Published in: IEEE Transactions on Information Theory
Classifications:
- Quadratic programming (90C20)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
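The publication concerns the Lasso, i.e. $\ell_1$-penalized (equivalently, $\ell_1$-constrained) quadratic programming for sparse support recovery. As a minimal illustrative sketch of that setting (not the paper's own code; the problem sizes, noise level, and regularization parameter `lam` below are arbitrary assumptions), the penalized form $\min_\beta \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1$ can be solved by cyclic coordinate descent with soft-thresholding:

```python
# Illustrative sketch only: Lasso via cyclic coordinate descent.
# All dimensions and the choice lam = 0.1 are assumptions for the demo.
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Minimize (1/2n)||y - X beta||^2 + lam * ||beta||_1
    # by cycling over coordinates with exact 1-D updates.
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

# Toy sparse-recovery instance: k = 3 active coefficients out of p = 50.
rng = np.random.default_rng(0)
n, p, k = 200, 50, 3
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat = lasso_cd(X, y, lam=0.1)
support = set(np.flatnonzero(np.abs(beta_hat) > 1e-6))
```

In this regime (many samples, small noise) the estimated support should contain the true active set, matching the support-recovery question the paper analyzes; the paper's sharp thresholds characterize exactly when such recovery succeeds or fails as $(n, p, k)$ scale.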
Cited in (showing first 100 items):
- A framework for solving mixed-integer semidefinite programs
- Optimal false discovery control of minimax estimators
- A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- An unbiased approach to compressed sensing
- LASSO risk and phase transition under dependence
- Simple expressions of the Lasso and SLOPE estimators in low-dimension
- A global homogeneity test for high-dimensional linear regression
- Revisiting feature selection for linear models with FDR and power guarantees
- Regularized estimation of high-dimensional factor-augmented vector autoregressive (FAVAR) models
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Sparse principal component based high-dimensional mediation analysis
- Exploiting prior knowledge in compressed sensing to design robust systems for endoscopy image recovery
- Boosting with structural sparsity: a differential inclusion approach
- Asymptotic theory of \(\ell_1\)-regularized PDE identification from a single noisy trajectory
- A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
- Recovering structured signals in noise: least-squares meets compressed sensing
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Approximate support recovery of atomic line spectral estimation: a tale of resolution and precision
- Fundamental limits of exact support recovery in high dimensions
- Statistical inference for model parameters in stochastic gradient descent
- Which bridge estimator is the best for variable selection?
- Model selection with mixed variables on the Lasso path
- Consistent multiple changepoint estimation with fused Gaussian graphical models
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Sparse learning via Boolean relaxations
- A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Evaluating visual properties via robust HodgeRank
- Predictor ranking and false discovery proportion control in high-dimensional regression
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- A convex optimization framework for the identification of homogeneous reaction systems
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Discussion: Latent variable graphical model selection via convex optimization
- Discussion: Latent variable graphical model selection via convex optimization
- Variable selection with Hamming loss
- L1-norm-based principal component analysis with adaptive regularization
- Optimal sparse linear prediction for block-missing multi-modality data without imputation
- Provable training set debugging for linear regression
- Statistical analysis of sparse approximate factor models
- Feature selection for data integration with mixed multiview data
- The all-or-nothing phenomenon in sparse linear regression
- Spatially relaxed inference on high-dimensional linear models
- De-biasing the Lasso with degrees-of-freedom adjustment
- Asymptotic theory of the adaptive sparse group Lasso
- Robust controllability assessment and optimal actuator placement in dynamic networks
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Truncated \(L_1\) regularized linear regression: theory and algorithm
- Bayesian group selection in logistic regression with application to MRI data analysis
- Information criteria bias correction for group selection
- Sparse quadratic classification rules via linear dimension reduction
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Model selection for high-dimensional quadratic regression via regularization
- Estimating time-varying networks
- The adaptive Lasso in high-dimensional sparse heteroscedastic models
- Detection boundary in sparse regression
- On the conditions used to prove oracle results for the Lasso
- Least squares after model selection in high-dimensional sparse models
- Sign-constrained least squares estimation for high-dimensional regression
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- RIPless compressed sensing from anisotropic measurements
- Sparse directed acyclic graphs incorporating the covariates
- On semidefinite relaxations for the block model
- Discussion: Latent variable graphical model selection via convex optimization
- A significance test for the lasso
- Bootstrap inference for network construction with an application to a breast cancer microarray study
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Sparse semiparametric discriminant analysis
- Rejoinder: Latent variable graphical model selection via convex optimization
- High-dimensional change-point estimation: combining filtering with convex optimization
- The Lasso problem and uniqueness
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- On constrained and regularized high-dimensional regression
- Sparse classification: a scalable discrete optimization perspective
- Discussion: "A significance test for the lasso"
- Minimax-optimal nonparametric regression in high dimensions
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- Discussion: Latent variable graphical model selection via convex optimization
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Estimation and variable selection with exponential weights
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
- Improved variable selection with forward-lasso adaptive shrinkage
- On model selection consistency of regularized M-estimators
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Two are better than one: fundamental parameters of frame coherence
- Nonnegative-Lasso and application in index tracking
- A discussion on practical considerations with sparse regression methodologies
- A new perspective on least squares under convex constraint
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- AIC for the Lasso in generalized linear models
- A numerical exploration of compressed sampling recovery
- A posterior probability approach for gene regulatory network inference in genetic perturbation data
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Factor-Adjusted Regularized Model Selection
- Nonnegative elastic net and application in index tracking
- Rejoinder: "A significance test for the lasso"