Regularity properties for sparse regression
From MaRDI portal
Abstract: Statistical and machine learning theory has developed several conditions ensuring that popular estimators such as the Lasso or the Dantzig selector perform well in high-dimensional sparse regression, including the restricted eigenvalue, compatibility, and sensitivity properties. However, some of the central aspects of these conditions are not well understood. For instance, it is unknown whether these conditions can be checked efficiently on any given data set. This is problematic, because they are at the core of the theory of sparse regression. Here we provide a rigorous proof that these conditions are NP-hard to check. This shows that the conditions are computationally infeasible to verify, and raises questions about their practical applicability. However, by taking an average-case perspective instead of the worst-case view of NP-hardness, we show that a particular condition, sensitivity, has certain desirable properties. This condition is weaker and more general than the others. We show that it holds with high probability in models where the parent population is well behaved, and that it is robust to certain data processing steps. These results are desirable, as they provide guidance about when the condition, and more generally the theory of sparse regression, may be relevant in the analysis of high-dimensional correlated observational data.
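To illustrate why verifying such regularity conditions is computationally demanding, the following sketch (not from the paper) computes a related sparse-eigenvalue quantity by brute force: the smallest eigenvalue of the normalized Gram submatrix \(X_S^\top X_S/n\) over all supports \(S\) of size \(s\). The exhaustive search over all \(\binom{p}{s}\) supports is feasible only for tiny \(p\), which is the combinatorial obstacle behind the hardness results the abstract describes. The function name and setup are illustrative choices, not the paper's notation.

```python
# Illustrative sketch: brute-force evaluation of a sparse-eigenvalue
# quantity related to restricted eigenvalue / restricted isometry conditions.
import itertools
import numpy as np

def min_sparse_eigenvalue(X, s):
    """Smallest eigenvalue of (1/n) X_S^T X_S over all supports |S| = s.

    Enumerates all C(p, s) column subsets, so this is tractable only for
    very small p -- exactly the scaling issue behind NP-hardness of
    verifying such conditions on real data.
    """
    n, p = X.shape
    best = np.inf
    for S in itertools.combinations(range(p), s):
        G = X[:, S].T @ X[:, S] / n            # normalized Gram submatrix
        best = min(best, np.linalg.eigvalsh(G)[0])  # eigvalsh returns ascending order
    return best

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))  # small p keeps the enumeration feasible
print(min_sparse_eigenvalue(X, 3))  # strictly positive: 3-sparse submatrices well conditioned
```

A strictly positive value means every size-\(s\) column submatrix is well conditioned; for designs with \(p\) in the hundreds, the number of supports already exceeds what any enumeration can cover, which is why average-case guarantees such as those in this paper are of practical interest.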
Recommendations
- Weaker regularity conditions and sparse recovery in high-dimensional regression
- Sparse Regularization via Convex Analysis
- Sparse regular variation
- Nonparametric sparsity and regularization
- scientific article; zbMATH DE number 7750673
- A note on sparse least-squares regression
- An efficient sparse regularity concept
- Analysis of Sparse Regularization Based Robust Regression Approaches
- Sparse regression using mixed norms
Cites work
- scientific article; zbMATH DE number 5787962
- scientific article; zbMATH DE number 741240
- scientific article; zbMATH DE number 845714
- Atomic decomposition by basis pursuit
- Certifying the Restricted Isometry Property is Hard
- Compressed Sensing and Redundant Dictionaries
- Computational Complexity
- Decoding by Linear Programming
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- On the conditions used to prove oracle results for the Lasso
- Optimal solutions for sparse principal component analysis
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Reconstruction From Anisotropic Random Measurements
- Restricted eigenvalue properties for correlated Gaussian designs
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Statistics for high-dimensional data. Methods, theory and applications.
- Testing the nullspace property using semidefinite programming
- The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing
- The Dantzig selector and sparsity oracle inequalities
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Uncertainty principles and ideal atomic decomposition
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (9)
- Sparse regression using mixed norms
- Spatial Homogeneity Pursuit of Regression Coefficients for Large Datasets
- On the beta prime prior for scale parameters in high-dimensional Bayesian regression models
- Design of c-optimal experiments for high-dimensional linear models
- Sparse regularized fuzzy regression
- A note on sparse least-squares regression
- Weaker regularity conditions and sparse recovery in high-dimensional regression
- On computationally tractable selection of experiments in measurement-constrained regression models
- An Alternating Method for Cardinality-Constrained Optimization: A Computational Study for the Best Subset Selection and Sparse Portfolio Problems
This page was built for publication: Regularity properties for sparse regression