Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)

From MaRDI portal

Publication: 4975847

DOI: 10.1109/TIT.2009.2016018
zbMath: 1367.62220
MaRDI QID: Q4975847

Martin J. Wainwright

Publication date: 8 August 2017

Published in: IEEE Transactions on Information Theory




Related Items (only showing first 100 items)

Investigating competition in financial markets: a sparse autologistic model for dynamic network data
Variable Selection With Second-Generation P-Values
An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
Compressive Classification: Where Wireless Communications Meets Machine Learning
The Noise Collector for sparse recovery in high dimensions
Nonnegative elastic net and application in index tracking
High-dimensional change-point estimation: combining filtering with convex optimization
Unnamed Item
Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
Large-scale multivariate sparse regression with applications to UK Biobank
Adaptive Bayesian SLOPE: Model Selection With Incomplete Data
Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
Monte Carlo Simulation for Lasso-Type Problems by Estimator Augmentation
A convex optimization framework for the identification of homogeneous reaction systems
Adaptive multi-penalty regularization based on a generalized Lasso path
Statistical inference for model parameters in stochastic gradient descent
Sparse high-dimensional regression: exact scalable algorithms and phase transitions
Perspective functions: proximal calculus and applications in high-dimensional statistics
On estimation error bounds of the Elastic Net when \(p \gg n\)
A robust high dimensional estimation of a finite mixture of the generalized linear model
Review of Bayesian selection methods for categorical predictors using JAGS
High-dimensional dynamic systems identification with additional constraints
Pairwise sparse + low-rank models for variables of mixed type
Quadratic growth conditions and uniqueness of optimal solution to Lasso
Asymptotic Theory of \(\ell_1\)-Regularized PDE Identification from a Single Noisy Trajectory
L1-norm-based principal component analysis with adaptive regularization
Predictor ranking and false discovery proportion control in high-dimensional regression
Online sparse identification for regression models
Learning rates for partially linear functional models with high dimensional scalar covariates
Debiasing the debiased Lasso with bootstrap
Unnamed Item
A simple homotopy proximal mapping algorithm for compressive sensing
When Ramanujan meets time-frequency analysis in complicated time series analysis
Support union recovery in high-dimensional multivariate regression
Statistical analysis of sparse approximate factor models
Recovery of partly sparse and dense signals
Fundamental limits of exact support recovery in high dimensions
Multi-stage convex relaxation for feature selection
Optimal Sparse Linear Prediction for Block-missing Multi-modality Data Without Imputation
RIPless compressed sensing from anisotropic measurements
Sparse directed acyclic graphs incorporating the covariates
A relaxed-PPA contraction method for sparse signal recovery
Which bridge estimator is the best for variable selection?
Independently Interpretable Lasso for Generalized Linear Models
An unbiased approach to compressed sensing
Estimation and variable selection with exponential weights
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
Consistency of \(\ell_1\) recovery from noisy deterministic measurements
Sparse regression: scalable algorithms and empirical performance
A discussion on practical considerations with sparse regression methodologies
A look at robustness and stability of \(\ell_1\)- versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
Rejoinder: ``Sparse regression: scalable algorithms and empirical performance''
A Tuning-free Robust and Efficient Approach to High-dimensional Regression
A framework for solving mixed-integer semidefinite programs
Adaptive Huber Regression
CHAOTIC ANALOG-TO-INFORMATION CONVERSION: PRINCIPLE AND RECONSTRUCTABILITY WITH PARAMETER IDENTIFIABILITY
A significance test for the lasso
Discussion: ``A significance test for the lasso''
Rejoinder: ``A significance test for the lasso''
Pivotal estimation via square-root lasso in nonparametric regression
Truncated $L^1$ Regularized Linear Regression: Theory and Algorithm
A Tight Bound of Hard Thresholding
High-Dimensional Sparse Additive Hazards Regression
Sparse semiparametric discriminant analysis
High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
Iterative reweighted noninteger norm regularizing SVM for gene expression data classification
Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
A global homogeneity test for high-dimensional linear regression
A numerical exploration of compressed sampling recovery
Unnamed Item
Prediction error bounds for linear regression with the TREX
Unnamed Item
Unnamed Item
Boosting with structural sparsity: a differential inclusion approach
Prediction and estimation consistency of sparse multi-class penalized optimal scoring
Variable selection via adaptive false negative control in linear regression
Sorted concave penalized regression
Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
Low Complexity Regularization of Linear Inverse Problems
BAYESIAN HYPER-LASSOS WITH NON-CONVEX PENALIZATION
Model Selection for High-Dimensional Quadratic Regression via Regularization
Variable Selection for Nonparametric Learning with Power Series Kernels
Approximate support recovery of atomic line spectral estimation: a tale of resolution and precision
High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models
Asymptotic theory of the adaptive sparse group Lasso
Simple expressions of the LASSO and SLOPE estimators in low-dimension
Unnamed Item
A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
On the Use of the Lasso for Instrumental Variables Estimation with Some Invalid Instruments
Unnamed Item
Unnamed Item
Robust controllability assessment and optimal actuator placement in dynamic networks
Sparsistency and agnostic inference in sparse PCA
Stability Selection
On model selection consistency of regularized M-estimators
Lasso penalized semiparametric regression on high-dimensional recurrent event data via coordinate descent
Minimax-optimal nonparametric regression in high dimensions
Sparse learning via Boolean relaxations







This page was built for publication: Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)