Boosting with structural sparsity: a differential inclusion approach (Q2278448)

From MaRDI portal
Property / DOI: 10.1016/j.acha.2017.12.004
Property / OpenAlex ID: W2624594050
Property / arXiv ID: 1704.04833
Property / cites work: Simultaneous analysis of Lasso and Dantzig selector
Property / cites work: Boosting With the $L_2$ Loss
Property / cites work: Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
Property / cites work: Least angle regression. (With discussion)
Property / cites work: The Split Bregman Method for L1-Regularized Problems
Property / cites work: Q4897663
Property / cites work: Sparse recovery via differential inclusions
Property / cites work: Nonlinear total variation based noise removal algorithms
Property / cites work: Q4864293
Property / cites work: Sparsity and Smoothness Via the Fused Lasso
Property / cites work: The solution path of the generalized lasso
Property / cites work: Greed is Good: Algorithmic Results for Sparse Approximation
Property / cites work: Robust Sparse Analysis Regularization
Property / cites work: Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
Property / cites work: On early stopping in gradient descent learning
Property / cites work: Split Bregman method for large scale fused Lasso
Property / cites work: Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
Property / cites work: On the Non-Negative Garrotte Estimator
Property / cites work: Q3174050
Property / cites work: The Adaptive Lasso and Its Oracle Properties


Language: English
Label: Boosting with structural sparsity: a differential inclusion approach
Description: scientific article

    Statements

    Boosting with structural sparsity: a differential inclusion approach (English)
    5 December 2019
    boosting
    differential inclusions
    structural sparsity
    linearized Bregman iteration
    variable splitting
    generalized Lasso
    model selection
    consistency
