Weaker regularity conditions and sparse recovery in high-dimensional regression
Publication: 2336858
DOI: 10.1155/2014/946241
zbMath: 1442.62155
DBLP: journals/jam/WangSS14
OpenAlex: W2019255421
Wikidata: Q59054411
Scholia: Q59054411
MaRDI QID: Q2336858
Authors: Limin Su, Shi-Qing Wang, Yan Shi
Publication date: 19 November 2019
Published in: Journal of Applied Mathematics
Full work available at URL: https://doi.org/10.1155/2014/946241
Mathematics Subject Classification
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Cites Work
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization
- The oracle inequalities on simultaneous Lasso and Dantzig selector in high-dimensional nonparametric regression
- Statistics for high-dimensional data. Methods, theory and applications.
- The Dantzig selector and sparsity oracle inequalities
- Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling
- Generalization of constraints for high dimensional regression problems
- A simple proof of the restricted isometry property for random matrices
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- A remark on the Lasso and the Dantzig selector
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Decoding by Linear Programming
- Atomic Decomposition by Basis Pursuit
- Stable signal recovery from incomplete and inaccurate measurements