Second Derivatives for Optimizing Eigenvalues of Symmetric Matrices
From MaRDI portal
Publication: 4842559
DOI: 10.1137/S089547989324598X  zbMath: 0832.65036  MaRDI QID: Q4842559
Michael L. Overton, Robert S. Womersley
Publication date: 5 March 1996
Published in: SIAM Journal on Matrix Analysis and Applications
nonsmooth optimization; quadratic programming; iterative method; Newton iteration; local quadratic convergence; maximum eigenvalue; multiple eigenvalues; second derivative; eigenprojections; prescribed multiplicity
Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Numerical mathematical programming methods (65K05); Convex programming (90C25); Quadratic programming (90C20)
Related Items (31)
A Lipschitzian error bound for convex quadratic symmetric cone programming ⋮ THE CENTRAL PATH IN SMOOTH CONVEX SEMIDEFINITE PROGRAMS ⋮ Generalized derivatives of eigenvalues of a symmetric matrix ⋮ Bilinear Quadratures for Inner Products ⋮ Faster and More Accurate Computation of the $\mathcal{H}_\infty$ Norm via Optimization ⋮ A Second-Order Bundle Method Based on -Decomposition Strategy for a Special Class of Eigenvalue Optimizations ⋮ Principal components: a descent algorithm ⋮ The spectral bundle method with second-order information ⋮ Special backtracking proximal bundle method for nonconvex maximum eigenvalue optimization ⋮ A space decomposition scheme for maximum eigenvalue functions and its applications ⋮ A hierarchy of spectral relaxations for polynomial optimization ⋮ The space decomposition method for the sum of nonlinear convex maximum eigenvalues and its applications ⋮ Root-Max Problems, Hybrid Expansion-Contraction, and Quadratically Convergent Optimization of Passive Systems ⋮ The space decomposition theory for a class of eigenvalue optimizations ⋮ \(\mathcal{UV}\)-theory of a class of semidefinite programming and its applications ⋮ Regularization using a parameterized trust region subproblem ⋮ A fast space-decomposition scheme for nonconvex eigenvalue optimization ⋮ Derivatives of functions of eigenvalues and eigenvectors for symmetric matrices ⋮ Smooth convex approximation to the maximum eigenvalue function ⋮ Approximate augmented Lagrangian functions and nonlinear semidefinite programs ⋮ Quadratic expansions of spectral functions ⋮ Lower-order penalization approach to nonlinear semidefinite programming ⋮ Extended and Improved Criss-Cross Algorithms for Computing the Spectral Value Set Abscissa and Radius ⋮ The 𝒰-Lagrangian of a convex function ⋮ A Decomposition Algorithm for the Sums of the Largest Eigenvalues ⋮ A sequential quadratic penalty method for nonlinear semidefinite programming ⋮ A sequential quadratic penalty method for nonlinear semidefinite programming ⋮ First- and second-order epi-differentiability in eigenvalue optimization ⋮ Computing the Kreiss Constant of a Matrix ⋮ Spectral bundle methods for non-convex maximum eigenvalue functions: first-order methods ⋮ Spectral bundle methods for non-convex maximum eigenvalue functions: second-order methods
This page was built for publication: Second Derivatives for Optimizing Eigenvalues of Symmetric Matrices