Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
Publication: 4975847
DOI: 10.1109/TIT.2009.2016018 · zbMATH Open: 1367.62220 · MaRDI QID: Q4975847 · FDO: Q4975847
Author: Martin J. Wainwright
Publication date: May 2009
Published in: IEEE Transactions on Information Theory
MSC classifications:
- Quadratic programming (90C20)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
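This record indexes Martin J. Wainwright's analysis of when the Lasso, i.e. the \(\ell_1\)-penalized quadratic program \(\min_{\beta} \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda_n \|\beta\|_1\), exactly recovers the support of a sparse signal; for standard Gaussian designs the paper identifies a sharp sample-size threshold of order \(2s\log(p-s)\), where \(s\) is the sparsity and \(p\) the ambient dimension. Below is a minimal numerical sketch of that support-recovery regime, not part of the original record: it assumes numpy and scikit-learn, and the constants and the choice of regularization level are illustrative rather than the paper's exact prescriptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch of exact support recovery near the sharp threshold
# n ~ 2 s log(p - s) analyzed in the paper; all constants here are
# illustrative choices, not the paper's exact ones.
rng = np.random.default_rng(0)
p, s = 500, 10                          # ambient dimension, sparsity level
n = int(2 * s * np.log(p - s)) + s      # sample size just above the threshold

X = rng.standard_normal((n, p))         # i.i.d. standard Gaussian design
beta = np.zeros(p)
beta[:s] = 1.0                          # true support = first s coordinates
sigma = 0.1
y = X @ beta + sigma * rng.standard_normal(n)

# Regularization of order sigma * sqrt(log(p) / n), matching the scaling in
# the paper's theory; scikit-learn's Lasso minimizes
# (1 / (2n)) * ||y - X w||_2^2 + alpha * ||w||_1, the same normalization.
alpha = 2 * sigma * np.sqrt(np.log(p) / n)
fit = Lasso(alpha=alpha, max_iter=50_000).fit(X, y)

print("true support:     ", np.flatnonzero(beta))
print("recovered support:", np.flatnonzero(fit.coef_))
```

Since recovery at the threshold is a high-probability event rather than a certainty, rerunning with a different seed, a smaller n, or a weaker signal can flip the outcome, which is exactly the phase-transition behavior the paper quantifies.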
Cited in (showing the first 100 items):
- A framework for solving mixed-integer semidefinite programs
- Optimal false discovery control of minimax estimators
- A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- An unbiased approach to compressed sensing
- LASSO risk and phase transition under dependence
- Simple expressions of the Lasso and SLOPE estimators in low-dimension
- A global homogeneity test for high-dimensional linear regression
- Revisiting feature selection for linear models with FDR and power guarantees
- Regularized estimation of high-dimensional factor-augmented vector autoregressive (FAVAR) models
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Sparse principal component based high-dimensional mediation analysis
- Exploiting prior knowledge in compressed sensing to design robust systems for endoscopy image recovery
- Boosting with structural sparsity: a differential inclusion approach
- Asymptotic theory of \(\ell_1\)-regularized PDE identification from a single noisy trajectory
- A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
- Recovering structured signals in noise: least-squares meets compressed sensing
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Approximate support recovery of atomic line spectral estimation: a tale of resolution and precision
- Fundamental limits of exact support recovery in high dimensions
- Statistical inference for model parameters in stochastic gradient descent
- Which bridge estimator is the best for variable selection?
- Model selection with mixed variables on the Lasso path
- Consistent multiple changepoint estimation with fused Gaussian graphical models
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Sparse learning via Boolean relaxations
- A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Evaluating visual properties via robust HodgeRank
- Predictor ranking and false discovery proportion control in high-dimensional regression
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- A convex optimization framework for the identification of homogeneous reaction systems
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Discussion: Latent variable graphical model selection via convex optimization
- Discussion: Latent variable graphical model selection via convex optimization
- Variable selection with Hamming loss
- L1-norm-based principal component analysis with adaptive regularization
- Optimal sparse linear prediction for block-missing multi-modality data without imputation
- Provable training set debugging for linear regression
- Statistical analysis of sparse approximate factor models
- Feature selection for data integration with mixed multiview data
- The all-or-nothing phenomenon in sparse linear regression
- Spatially relaxed inference on high-dimensional linear models
- De-biasing the Lasso with degrees-of-freedom adjustment
- Asymptotic theory of the adaptive sparse group Lasso
- Robust controllability assessment and optimal actuator placement in dynamic networks
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Truncated \(L_1\) regularized linear regression: theory and algorithm
- Bayesian group selection in logistic regression with application to MRI data analysis
- Information criteria bias correction for group selection
- Sparse quadratic classification rules via linear dimension reduction
- Recovery of partly sparse and dense signals
- Sharp recovery bounds for convex demixing, with applications
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Learning high-dimensional Gaussian linear structural equation models with heterogeneous error variances
- A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model
- Penalized logspline density estimation using total variation penalty
- Sparse regression: scalable algorithms and empirical performance
- Sorted concave penalized regression
- Bayesian augmented Lagrangian algorithm for system identification
- Multi-stage convex relaxation for feature selection
- A general family of trimmed estimators for robust high-dimensional data analysis
- Adaptive log-density estimation
- The generalized Lasso problem and uniqueness
- Model selection consistency of Lasso for empirical data
- Generalized Kalman smoothing: modeling and algorithms
- Discussion: Latent variable graphical model selection via convex optimization
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- Oracle inequalities for high-dimensional prediction
- Low complexity regularization of linear inverse problems
- Discussion: Latent variable graphical model selection via convex optimization
- A tight bound of hard thresholding
- A tuning-free robust and efficient approach to high-dimensional regression
- A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
- Prediction error bounds for linear regression with the TREX
- Sparse recovery via differential inclusions
- Subspace clustering of high-dimensional data: a predictive approach
- Perspective functions: proximal calculus and applications in high-dimensional statistics
- Variable selection, monotone likelihood ratio and group sparsity
- Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
- Clustering High-Dimensional Data via Feature Selection
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- On the sign consistency of the Lasso for the high-dimensional Cox model
- Multivariate factorizable expectile regression with application to fMRI data
- Adaptive Huber Regression
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Robust inference for high‐dimensional single index models
- CAM: causal additive models, high-dimensional order search and penalized regression
- Sparse hierarchical regression with polynomials
- Sparse approximation over the cube
- A relaxed-PPA contraction method for sparse signal recovery
- Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems
- Penalized wavelet estimation and robust denoising for irregular spaced data