The oracle inequalities on simultaneous Lasso and Dantzig selector in high-dimensional nonparametric regression (Q473837)

From MaRDI portal
Language: English
Label: The oracle inequalities on simultaneous Lasso and Dantzig selector in high-dimensional nonparametric regression
Description: scientific article

    Statements

    The oracle inequalities on simultaneous Lasso and Dantzig selector in high-dimensional nonparametric regression (English)
    24 November 2014
    Summary: In recent years, a great deal of attention has focused on the Lasso and the Dantzig selector in high-dimensional linear regression, where the number of variables can be much larger than the sample size. Under a sparsity scenario, earlier papers discussed the relations between the Lasso and the Dantzig selector and derived sparsity oracle inequalities for the prediction risk and bounds on the \(L_p\) estimation loss. In this paper, we point out that some authors overemphasize the role of certain sparsity conditions, and that assumptions based on such a sparsity condition may lead to poor results. We give better assumptions and methods that avoid using the sparsity condition. In comparison with the results of \textit{P. J. Bickel} et al. [Ann. Stat. 37, No. 4, 1705--1732 (2009; Zbl 1173.62022)], more precise oracle inequalities for the prediction risk and bounds on the \(L_p\) estimation loss are derived when the number of variables can be much larger than the sample size.
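    For context, the two estimators named in the abstract have standard definitions in the literature it cites; a sketch in the usual notation (design matrix \(X \in \mathbb{R}^{n \times p}\), response \(y\), tuning parameter \(\lambda > 0\), following Bickel et al. (2009) — this notation is assumed, not taken from this page):

    \[
    \hat\beta^{\mathrm{Lasso}} \in \arg\min_{\beta \in \mathbb{R}^p}
      \left\{ \frac{1}{n}\,\|y - X\beta\|_2^2 + 2\lambda \|\beta\|_1 \right\},
    \]
    \[
    \hat\beta^{\mathrm{Dantzig}} \in \arg\min
      \left\{ \|\beta\|_1 : \beta \in \mathbb{R}^p,\;
        \bigl\| \tfrac{1}{n} X^{\top} (y - X\beta) \bigr\|_\infty \le \lambda \right\}.
    \]

    The oracle inequalities in question bound the prediction risk and the \(L_p\) loss of these two estimators simultaneously, under conditions on the design matrix weaker than the sparsity condition criticized above.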

    Identifiers