I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error (Q1750288)

Language: English
Description: scientific article

    Statements

    I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error (English)
    18 May 2018
    This paper proposes I-LAMM (iterative local adaptive majorize-minimization), a general computational framework for solving nonconvex optimization problems such as the penalized M-estimator \(\mathrm{argmin}_{\beta\in{\mathbb R}^d}\{ {\mathcal L}(\beta) + {\mathcal R}_{\lambda}(\beta)\}\), where \({\mathcal L}(\beta)\) is a smooth loss function and \({\mathcal R}_{\lambda}(\beta)\) is a sparsity-inducing penalty with regularization parameter \(\lambda\). The proposed strategy simultaneously controls the algorithmic complexity and the statistical error when fitting high-dimensional models arising in problems such as low-rank matrix completion, high-dimensional graphical models, and quantile regression.
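    A minimal Python sketch of the local adaptive majorize-minimization idea underlying the framework, under simplifying assumptions: a least-squares loss and a convex \(\ell_1\) penalty are used for concreteness (the paper's framework covers general smooth losses and sparsity-inducing penalties), and the names soft_threshold, lamm_step, and lamm_lasso are illustrative only, not the authors' implementation. At each step the smooth loss is majorized at the current iterate by an isotropic quadratic whose parameter phi is inflated until the majorization holds, and the surrogate plus the \(\ell_1\) penalty is minimized in closed form by soft thresholding.

    import numpy as np

    def soft_threshold(z, t):
        # Proximal map of t * ||.||_1: shrink every coordinate toward zero by t.
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def lamm_step(beta, loss, grad, lam, phi, gamma=2.0):
        # One local adaptive majorize-minimize step. The smooth loss is replaced
        # by the surrogate loss(beta) + <g, b - beta> + (phi/2)||b - beta||^2;
        # phi is inflated by gamma until the surrogate majorizes the loss at the
        # candidate minimizer, so the penalized objective never increases.
        g = grad(beta)
        while True:
            candidate = soft_threshold(beta - g / phi, lam / phi)
            diff = candidate - beta
            if loss(candidate) <= loss(beta) + g @ diff + 0.5 * phi * diff @ diff + 1e-12:
                return candidate, phi
            phi *= gamma

    def lamm_lasso(X, y, lam, n_iter=500, tol=1e-8):
        # L1-penalized least squares solved by iterating LAMM steps.
        n, d = X.shape
        loss = lambda b: 0.5 * np.sum((y - X @ b) ** 2) / n
        grad = lambda b: X.T @ (X @ b - y) / n
        beta, phi = np.zeros(d), 1.0
        for _ in range(n_iter):
            new_beta, phi = lamm_step(beta, loss, grad, lam, phi)
            if np.max(np.abs(new_beta - beta)) < tol:
                return new_beta
            beta = new_beta
        return beta

    Under the same assumptions, beta_hat = lamm_lasso(X, y, lam=0.1) returns a sparse coefficient estimate for a design matrix X and response vector y.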
    algorithmic statistics
    iteration complexity
    local adaptive MM
    nonconvex statistical optimization
    optimal rate of convergence