Regularity properties for sparse regression
DOI: 10.1007/s40304-015-0078-6 · zbMATH Open: 1341.62212 · arXiv: 1305.5198 · OpenAlex: W3124742607 · Wikidata: Q39664671 (Scholia: Q39664671) · MaRDI QID: Q279682 · FDO: Q279682
Authors: Edgar Dobriban, Jianqing Fan
Publication date: 29 April 2016
Published in: Communications in Mathematics and Statistics
Full work available at URL: https://arxiv.org/abs/1305.5198
Recommendations
- Weaker regularity conditions and sparse recovery in high-dimensional regression
- Sparse Regularization via Convex Analysis
- Sparse regular variation
- Nonparametric sparsity and regularization
- A note on sparse least-squares regression
- An efficient sparse regularity concept
- Analysis of Sparse Regularization Based Robust Regression Approaches
- Sparse regression using mixed norms
Keywords: computational complexity; high-dimensional statistics; \(\ell_q\) sensitivity; restricted eigenvalue; sparse regression
MSC: Linear regression; mixed models (62J05) · Estimation in multivariate analysis (62H12) · Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17)
Cites Work
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Restricted eigenvalue properties for correlated Gaussian designs
- Sparsity oracle inequalities for the Lasso
- Optimal solutions for sparse principal component analysis
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Decoding by Linear Programming
- Reconstruction From Anisotropic Random Measurements
- Title not available
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- The Dantzig selector and sparsity oracle inequalities
- Title not available
- Uncertainty principles and ideal atomic decomposition
- Computational Complexity
- The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing
- Atomic decomposition by basis pursuit
- Certifying the Restricted Isometry Property is Hard
- Compressed Sensing and Redundant Dictionaries
- Testing the nullspace property using semidefinite programming
Cited In (9)
- Sparse regularized fuzzy regression
- Design of c-optimal experiments for high-dimensional linear models
- Sparse regression using mixed norms
- An Alternating Method for Cardinality-Constrained Optimization: A Computational Study for the Best Subset Selection and Sparse Portfolio Problems
- Spatial Homogeneity Pursuit of Regression Coefficients for Large Datasets
- On the beta prime prior for scale parameters in high-dimensional Bayesian regression models
- On computationally tractable selection of experiments in measurement-constrained regression models
- Weaker regularity conditions and sparse recovery in high-dimensional regression
- A note on sparse least-squares regression