Global convergence of ADMM in nonconvex nonsmooth optimization (Q1736880)

From MaRDI portal
scientific article

    Statements

    Global convergence of ADMM in nonconvex nonsmooth optimization (English)
    26 March 2019
    The authors consider the following optimization problem: \[ \left\{ \begin{array}{l}\underset{x_{0},\ldots ,x_{p},y}{\mathrm{min}}\mathrm{\quad }\phi (x_{0},\ldots ,x_{p},y) \\ \text{subject to }\, A_{0}x_{0}+A_{1}x_{1}+\ldots +A_{p}x_{p}+By=0 \end{array}\right. \] where \(\phi \) is a continuous (possibly nonconvex, nonsmooth) function, \(x_{i}\in \mathbb{R}^{n_{i}}\), \(y\in \mathbb{R}^{q}\), and \(A_{i}\), \(B\) are \(m\times n_{i}\) (resp. \(m\times q\)) matrices. The authors propose an ADMM (alternating direction method of multipliers) type algorithm (multi-block version) that updates successively each of the primal variables \(x_{0}, \dots, x_{p}, y\), followed by an update of the dual variable. Under certain assumptions they establish global convergence. This approach applies to problems arising in matrix decomposition, sparse recovery, machine learning, and optimization on compact smooth manifolds, covering cases not treated by any previous convergence theory. They also discuss relations with the ALM (augmented Lagrangian method) and provide an example in which ADMM converges but the ALM diverges.
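    The block-by-block update pattern can be sketched as follows. This is a minimal two-block instance on a toy scalar consensus problem (the specific quadratic objective, the constraint \(x - y = 0\), i.e. \(A = 1\), \(B = -1\), and the penalty parameter are illustrative assumptions, not the authors' algorithm): each primal block minimizes the augmented Lagrangian in turn, then the dual variable takes an ascent step on the constraint residual.

    ```python
    def admm_consensus(a, b, beta=1.0, iters=200):
        """ADMM sketch for min 0.5*(x-a)**2 + 0.5*(y-b)**2 s.t. x - y = 0.

        Augmented Lagrangian: 0.5*(x-a)**2 + 0.5*(y-b)**2
                              + w*(x - y) + (beta/2)*(x - y)**2.
        """
        x = y = w = 0.0  # primal blocks x, y and dual multiplier w
        for _ in range(iters):
            # x-update: argmin_x of the augmented Lagrangian (closed form)
            x = (a - w + beta * y) / (1.0 + beta)
            # y-update: argmin_y, using the freshly updated x
            y = (b + w + beta * x) / (1.0 + beta)
            # dual update: ascent step on the constraint residual x - y
            w = w + beta * (x - y)
        return x, y, w
    ```

    On this toy instance the iterates converge to the consensus point \(x = y = (a+b)/2\), with the multiplier \(w\) recovering the optimal dual value; the interesting content of the paper is, of course, that a multi-block analogue of this scheme converges even for nonconvex, nonsmooth \(\phi\) under the authors' assumptions.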
    Keywords:

    nonconvex optimization
    alternating direction method of multipliers
    augmented Lagrangian method
    block coordinate descent
    sparse optimization
