Second Derivatives for Optimizing Eigenvalues of Symmetric Matrices
From MaRDI portal
Publication:4842559
Recommendations
- Iterative computation of second-order derivatives of eigenvalues and eigenvectors
- A quadratically convergent local algorithm on minimizing the largest eigenvalue of a symmetric matrix
- On Minimizing the Maximum Eigenvalue of a Symmetric Matrix
- On minimizing the largest eigenvalue of a symmetric matrix
- A quadratically convergent local algorithm on minimizing sums of the largest eigenvalues of a symmetric matrix
Cited in 34 documents:
- Faster and more accurate computation of the \(\mathcal{H}_\infty\) norm via optimization
- Derivatives of functions of eigenvalues and eigenvectors for symmetric matrices
- Bilinear quadratures for inner products
- Spectral bundle methods for non-convex maximum eigenvalue functions: second-order methods
- The \(\mathcal{U}\)-Lagrangian of a convex function
- Fast certifiable algorithm for the absolute pose estimation of a camera
- Variational characterization and Rayleigh quotient iteration of 2D eigenvalue problem with applications
- A sequential quadratic penalty method for nonlinear semidefinite programming
- A Lipschitzian error bound for convex quadratic symmetric cone programming
- Computing the Kreiss constant of a matrix
- The space decomposition theory for a class of eigenvalue optimizations
- A fast space-decomposition scheme for nonconvex eigenvalue optimization
- Generalized derivatives of eigenvalues of a symmetric matrix
- Quadratic expansions of spectral functions
- Root-Max Problems, Hybrid Expansion-Contraction, and Quadratically Convergent Optimization of Passive Systems
- Special backtracking proximal bundle method for nonconvex maximum eigenvalue optimization
- Extended and improved criss-cross algorithms for computing the spectral value set abscissa and radius
- The spectral bundle method with second-order information
- Regularization using a parameterized trust region subproblem
- A hierarchy of spectral relaxations for polynomial optimization
- Lower-order penalization approach to nonlinear semidefinite programming
- Principal components: a descent algorithm
- \(\mathcal{UV}\)-theory of a class of semidefinite programming and its applications
- A second-order bundle method based on \(\mathcal{UV}\)-decomposition strategy for a special class of eigenvalue optimizations
- The space decomposition method for the sum of nonlinear convex maximum eigenvalues and its applications
- First- and second-order epi-differentiability in eigenvalue optimization
- A space decomposition scheme for maximum eigenvalue functions and its applications
- The central path in smooth convex semidefinite programs
- Second-order directional derivatives of all eigenvalues of a symmetric matrix
- Smooth convex approximation to the maximum eigenvalue function
- Approximate augmented Lagrangian functions and nonlinear semidefinite programs
- A decomposition algorithm for the sums of the largest eigenvalues
- Spectral bundle methods for non-convex maximum eigenvalue functions: first-order methods