Second Derivatives for Optimizing Eigenvalues of Symmetric Matrices
DOI: 10.1137/S089547989324598X · zbMATH Open: 0832.65036 · MaRDI QID: Q4842559
Authors: Michael L. Overton, Robert S. Womersley
Publication date: 5 March 1996
Published in: SIAM Journal on Matrix Analysis and Applications
Recommendations
- Iterative computation of second-order derivatives of eigenvalues and eigenvectors
- A quadratically convergent local algorithm on minimizing the largest eigenvalue of a symmetric matrix
- On Minimizing the Maximum Eigenvalue of a Symmetric Matrix
- On minimizing the largest eigenvalue of a symmetric matrix
- A quadratically convergent local algorithm on minimizing sums of the largest eigenvalues of a symmetric matrix
Keywords: quadratic programming; nonsmooth optimization; iterative method; Newton iteration; local quadratic convergence; maximum eigenvalue; multiple eigenvalues; second derivative; eigenprojections; prescribed multiplicity
Numerical mathematical programming methods (65K05) Quadratic programming (90C20) Convex programming (90C25) Numerical computation of eigenvalues and eigenvectors of matrices (65F15)
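The keywords above center on first- and second-order derivatives of the maximum eigenvalue. As a minimal numerical sketch (not taken from the paper), for a symmetric family \(A(t) = A + tE\) whose largest eigenvalue is simple, classical perturbation theory gives \(\lambda_{\max}'(0) = q^T E q\) and \(\lambda_{\max}''(0) = 2\sum_{j \ne \max} (v_j^T E q)^2 / (\lambda_{\max} - \lambda_j)\), where \(q\) is the unit eigenvector for \(\lambda_{\max}\); both can be checked against finite differences:

```python
import numpy as np

# Sketch (standard perturbation theory, not the paper's algorithm):
# for A(t) = A + t*E symmetric with a simple largest eigenvalue,
#   d/dt  lambda_max = q^T E q
#   d2/dt2 lambda_max = 2 * sum_{j != max} (v_j^T E q)^2 / (lam_max - lam_j)
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # random symmetric matrix
E = rng.standard_normal((n, n)); E = (E + E.T) / 2   # symmetric perturbation

w, V = np.linalg.eigh(A)        # eigenvalues in ascending order
q = V[:, -1]                    # eigenvector of the largest eigenvalue

d1 = q @ E @ q                  # analytic first derivative
d2 = 2 * sum((V[:, j] @ E @ q) ** 2 / (w[-1] - w[j]) for j in range(n - 1))

lam = lambda t: np.linalg.eigh(A + t * E)[0][-1]     # lambda_max along the line
h = 1e-4
fd1 = (lam(h) - lam(-h)) / (2 * h)                   # central difference
fd2 = (lam(h) - 2 * lam(0) + lam(-h)) / h ** 2       # second difference

print(abs(d1 - fd1), abs(d2 - fd2))   # both differences are small
```

The simple-eigenvalue assumption is essential: the paper's contribution is precisely the harder case of multiple eigenvalues, where \(\lambda_{\max}\) is nonsmooth and these formulas no longer apply directly.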
Cited In (34)
- Derivatives of functions of eigenvalues and eigenvectors for symmetric matrices
- Bilinear quadratures for inner products
- Fast certifiable algorithm for the absolute pose estimation of a camera
- Variational characterization and Rayleigh quotient iteration of 2D eigenvalue problem with applications
- The \(\mathcal{U}\)-Lagrangian of a convex function
- Spectral bundle methods for non-convex maximum eigenvalue functions: second-order methods
- A sequential quadratic penalty method for nonlinear semidefinite programming
- Computing the Kreiss constant of a matrix
- A Lipschitzian error bound for convex quadratic symmetric cone programming
- The space decomposition theory for a class of eigenvalue optimizations
- Quadratic expansions of spectral functions
- Generalized derivatives of eigenvalues of a symmetric matrix
- A fast space-decomposition scheme for nonconvex eigenvalue optimization
- Root-Max Problems, Hybrid Expansion-Contraction, and Quadratically Convergent Optimization of Passive Systems
- Extended and improved criss-cross algorithms for computing the spectral value set abscissa and radius
- Special backtracking proximal bundle method for nonconvex maximum eigenvalue optimization
- A hierarchy of spectral relaxations for polynomial optimization
- The spectral bundle method with second-order information
- Regularization using a parameterized trust region subproblem
- Lower-order penalization approach to nonlinear semidefinite programming
- \(\mathcal{UV}\)-theory of a class of semidefinite programming and its applications
- Principal components: a descent algorithm
- A second-order bundle method based on \(\mathcal{UV}\)-decomposition strategy for a special class of eigenvalue optimizations
- The space decomposition method for the sum of nonlinear convex maximum eigenvalues and its applications
- The central path in smooth convex semidefinite programs
- First- and second-order epi-differentiability in eigenvalue optimization
- Second-order directional derivatives of all eigenvalues of a symmetric matrix
- A space decomposition scheme for maximum eigenvalue functions and its applications
- A decomposition algorithm for the sums of the largest eigenvalues
- Approximate augmented Lagrangian functions and nonlinear semidefinite programs
- Smooth convex approximation to the maximum eigenvalue function
- Spectral bundle methods for non-convex maximum eigenvalue functions: first-order methods
- Faster and more accurate computation of the \(\mathcal{H}_\infty\) norm via optimization