The space decomposition theory for a class of eigenvalue optimizations (Q457214)

Language: English
Label: The space decomposition theory for a class of eigenvalue optimizations
Description: scientific article

    Statements

    The space decomposition theory for a class of eigenvalue optimizations (English)
    26 September 2014
    The goal of the paper is to develop a framework for a second-order analysis of a class of eigenvalue optimization problems in which the eigenvalues are considered as functions of a symmetric matrix and each eigenvalue function involved is a D.C. function, i.e., a difference of two convex functions. For this purpose, the \({\mathcal{U}}\)-Lagrangian theory is applied to such a D.C. function and, in particular, the first- and second-order derivatives of the \({\mathcal{U}}\)-Lagrangian in the space of decision variables are derived under the assumption that a transversality condition is satisfied. Moreover, a conceptual algorithm is presented and its quadratic convergence is proved. As an application, the problem of best approximating a given symmetric matrix by a low-rank symmetric positive semidefinite matrix is studied.
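    To make the D.C. structure and the application concrete, a minimal numpy sketch follows. It is not the paper's \({\mathcal{U}}\)-Lagrangian algorithm; it only relies on the standard facts that the sum of the \(k\) largest eigenvalues of a symmetric matrix is convex (so each individual eigenvalue \(\lambda_k(X)=\sum_{i\le k}\lambda_i(X)-\sum_{i\le k-1}\lambda_i(X)\) is a difference of convex functions) and that keeping the \(r\) largest nonnegative eigenvalues of the eigendecomposition yields the Frobenius-norm best low-rank positive semidefinite approximation. All function names are illustrative.

```python
import numpy as np

def sum_k_largest_eigs(X, k):
    """Sum of the k largest eigenvalues of a symmetric matrix (a convex function of X)."""
    w = np.linalg.eigvalsh(X)          # eigenvalues in ascending order
    return w[-k:].sum() if k > 0 else 0.0

def kth_largest_eig_dc(X, k):
    """k-th largest eigenvalue, written as a difference of two convex functions (D.C.)."""
    return sum_k_largest_eigs(X, k) - sum_k_largest_eigs(X, k - 1)

def low_rank_psd_approx(A, r):
    """Frobenius-norm nearest PSD matrix of rank at most r to a symmetric matrix A,
    obtained by keeping the r largest nonnegative eigenvalues of A."""
    w, V = np.linalg.eigh(A)           # ascending eigenvalues, orthonormal eigenvectors
    w = np.clip(w, 0.0, None)          # enforce positive semidefiniteness
    keep = np.argsort(w)[-r:]          # indices of the r largest remaining eigenvalues
    return (V[:, keep] * w[keep]) @ V[:, keep].T

# tiny usage example on a random symmetric matrix
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
print(kth_largest_eig_dc(A, 2))        # second largest eigenvalue of A
X = low_rank_psd_approx(A, 2)          # rank-<=2 PSD approximation of A
print(np.linalg.matrix_rank(X), np.linalg.norm(A - X))
```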
    Keywords: nonsmooth optimization; eigenvalue optimization; \({\mathcal{VU}}\)-decomposition; \({\mathcal{U}}\)-Lagrangian; D.C. function; second-order derivative; low rank matrix approximation