I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error (Q1750288)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / arXiv ID: 1507.01037
Property / OpenAlex ID: W2964248738

Latest revision as of 10:24, 30 July 2024

Language: English
Label: I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Description: scientific article

    Statements

    I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error (English)
    Publication date: 18 May 2018
    This paper proposes a general computational framework for solving nonconvex optimization problems of the penalized M-estimation form \(\mathrm{argmin}_{\beta\in{\mathbb R}^d}\{ {\mathcal L}(\beta) + {\mathcal R}_{\lambda}(\beta)\}\), where \({\mathcal L}(\beta)\) is a smooth loss function and \({\mathcal R}_{\lambda}(\beta)\) is a sparsity-inducing penalty with regularization parameter \(\lambda\). The proposed strategy, iterative local adaptive majorize-minimization (I-LAMM), simultaneously controls the algorithmic complexity and the statistical error when fitting high-dimensional models arising in problems such as low-rank matrix completion, high-dimensional graphical models, and quantile regression.
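The paper's I-LAMM algorithm combines a contraction stage with tightening stages for folded-concave penalties; as a simplified illustration of the local adaptive majorize-minimize step only, here is a sketch for the \(\ell_1\)-penalized least-squares case (the function names `lamm_lasso` and `soft_threshold` are hypothetical, not from the paper):

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 penalty: sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lamm_lasso(X, y, lam, phi0=1.0, gamma=2.0, tol=1e-6, max_iter=500):
    """Local adaptive majorize-minimize sketch for the Lasso:
    minimize (1/2n)||y - X b||^2 + lam * ||b||_1.
    The quadratic coefficient phi is inflated by `gamma` until the
    isotropic quadratic majorizes the loss at the candidate iterate."""
    n, d = X.shape
    beta = np.zeros(d)
    loss = lambda b: 0.5 / n * np.sum((y - X @ b) ** 2)
    grad = lambda b: -X.T @ (y - X @ b) / n
    phi = phi0
    for _ in range(max_iter):
        g = grad(beta)
        while True:
            beta_new = soft_threshold(beta - g / phi, lam / phi)
            delta = beta_new - beta
            # Majorization check: the local quadratic must dominate the loss
            if loss(beta_new) <= loss(beta) + g @ delta + 0.5 * phi * delta @ delta:
                break
            phi *= gamma
        converged = np.max(np.abs(beta_new - beta)) < tol
        beta = beta_new
        if converged:
            break
        phi = max(phi0, phi / gamma)  # let phi shrink back (local adaptivity)
    return beta
```

Each step solves the isotropically majorized problem in closed form via soft thresholding, so per-iteration cost stays at one gradient evaluation; the adaptive choice of \(\phi\) avoids knowing the loss's smoothness constant in advance.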
    Keywords: algorithmic statistics; iteration complexity; local adaptive MM; nonconvex statistical optimization; optimal rate of convergence
