Weaker regularity conditions and sparse recovery in high-dimensional regression (Q2336858): Difference between revisions

Property / Wikidata QID: Q59054411 / rank: Normal rank
Property / describes a project that uses: PDCO / rank: Normal rank
Property / MaRDI profile type: MaRDI publication profile / rank: Normal rank
Property / full work available at URL: https://doi.org/10.1155/2014/946241 / rank: Normal rank
Property / OpenAlex ID: W2019255421 / rank: Normal rank
Property / cites work: Q4864293 / rank: Normal rank
Property / cites work: Atomic Decomposition by Basis Pursuit / rank: Normal rank
Property / cites work: The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder). / rank: Normal rank
Property / cites work: Statistics for high-dimensional data. Methods, theory and applications. / rank: Normal rank
Property / cites work: Stable signal recovery from incomplete and inaccurate measurements / rank: Normal rank
Property / cites work: The oracle inequalities on simultaneous Lasso and Dantzig selector in high-dimensional nonparametric regression / rank: Normal rank
Property / cites work: On the conditions used to prove oracle results for the Lasso / rank: Normal rank
Property / cites work: Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization / rank: Normal rank
Property / cites work: Simultaneous analysis of Lasso and Dantzig selector / rank: Normal rank
Property / cites work: Decoding by Linear Programming / rank: Normal rank
Property / cites work: Generalization of constraints for high dimensional regression problems / rank: Normal rank
Property / cites work: Q3174050 / rank: Normal rank
Property / cites work: Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling / rank: Normal rank
Property / cites work: Uniform uncertainty principle for Bernoulli and subgaussian ensembles / rank: Normal rank
Property / cites work: A simple proof of the restricted isometry property for random matrices / rank: Normal rank
Property / cites work: The Dantzig selector and sparsity oracle inequalities / rank: Normal rank
Property / cites work: A remark on the Lasso and the Dantzig selector / rank: Normal rank
Property / DBLP publication ID: journals/jam/WangSS14 / rank: Normal rank
 

Latest revision as of 01:46, 14 November 2024

Language: English
Label: Weaker regularity conditions and sparse recovery in high-dimensional regression
Description: scientific article

    Statements

    Weaker regularity conditions and sparse recovery in high-dimensional regression (English)
    19 November 2019
    Summary: Regularity conditions play a pivotal role in sparse recovery in high-dimensional regression. In this paper, we present a weaker regularity condition and discuss its relationship to other regularity conditions, such as the restricted eigenvalue condition. We study the behavior of our new condition for design matrices with independent random columns uniformly drawn on the unit sphere. Moreover, we show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, more precise bounds on the estimation loss and the prediction risk in the linear regression model when the number of variables may be much larger than the sample size.
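The setting in the summary (Lasso recovery of a sparse vector when the number of variables far exceeds the sample size) can be illustrated with a small numerical sketch. The example below is not the paper's own method: the dimensions, noise level, regularization constant, and the iterative soft-thresholding (ISTA) solver are all illustrative assumptions, with the regularization parameter chosen at the usual \(\sigma\sqrt{\log p / n}\) rate.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage toward zero)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=3000):
    # Minimize (1/(2n)) ||y - X b||^2 + lam ||b||_1 by iterative
    # soft-thresholding (ISTA) with a fixed step 1/L, where L is the
    # Lipschitz constant of the smooth part's gradient.
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2  # ord=2: largest singular value
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = soft_threshold(b - step * grad, step * lam)
    return b

# Illustrative sparse-recovery instance: p much larger than n, s-sparse truth.
rng = np.random.default_rng(0)
n, p, s, sigma = 100, 400, 5, 0.5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0
y = X @ beta + sigma * rng.standard_normal(n)

# Regularization at the sigma * sqrt(log p / n) rate (constant chosen ad hoc).
lam = 2.0 * sigma * np.sqrt(2.0 * np.log(p) / n)
b_hat = lasso_ista(X, y, lam)
support = np.flatnonzero(np.abs(b_hat) > 1.0)
```

With these (assumed) parameters the thresholded estimate recovers the true support `{0, ..., 4}`; the shrinkage bias of order `lam` on the active coefficients is the price the Lasso pays for sparsity, which is exactly the kind of trade-off the estimation-loss bounds in the paper quantify.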

    Identifiers