Numerical optimization of eigenvalues of Hermitian matrix functions
From MaRDI portal
Publication:2923367
Abstract: This work concerns the global minimization of a prescribed eigenvalue, or a weighted sum of prescribed eigenvalues, of a Hermitian matrix-valued function that depends analytically on its parameters over a box. We describe how the analytical properties of eigenvalue functions can be put to use to derive piecewise quadratic functions that underestimate the eigenvalue functions. These piecewise quadratic underestimators lead us to a global minimization algorithm, originally due to Breiman and Cutler. We prove the global convergence of the algorithm and show that it can be used effectively to minimize extreme eigenvalues, e.g., the largest eigenvalue or the sum of a specified number of largest eigenvalues. This is particularly facilitated by the analytical formulas for the first derivatives of eigenvalues, as well as by analytical lower bounds on the second derivatives that can be deduced for extreme eigenvalue functions. The applications we have in mind include the \(\mathcal{H}_\infty\)-norm of a linear dynamical system, the numerical radius, the distance to uncontrollability, and various other non-convex eigenvalue optimization problems for which, generically, the eigenvalue function involved is simple at all points.
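The scheme described in the abstract can be sketched in a few lines of Python. This is a simplified illustration, not the authors' implementation: the exact breakpoint computation of the piecewise quadratic model is replaced by a grid search, and the callback `f_df`, the example matrices `A0`, `A1`, and the bound `gamma` are assumptions made for the example.

```python
import numpy as np

def minimize_with_quadratic_supports(f_df, a, b, gamma, tol=1e-8, max_iter=200):
    """Global minimization of a smooth f on [a, b] in the spirit of the
    Breiman-Cutler scheme: f is bounded below by the piecewise quadratic
    q(x) = max_k [ f(x_k) + f'(x_k)(x - x_k) + (gamma/2)(x - x_k)^2 ],
    where gamma is a lower bound on f'' over [a, b].
    f_df(x) must return the pair (f(x), f'(x))."""
    xs = [a, b]
    data = [f_df(x) for x in xs]
    vals = [v for v, _ in data]
    grads = [g for _, g in data]
    best = min(vals)
    grid = np.linspace(a, b, 2001)   # grid search stands in for the exact
    lower = -np.inf                  # breakpoint computation used in practice
    for _ in range(max_iter):
        # Evaluate the piecewise quadratic underestimator on the grid.
        q = np.max([v + g * (grid - x) + 0.5 * gamma * (grid - x) ** 2
                    for x, v, g in zip(xs, vals, grads)], axis=0)
        i = int(np.argmin(q))
        lower = q[i]                 # certified lower bound on min f
        if best - lower <= tol:
            break
        x_new = grid[i]              # refine where the model is lowest
        v, g = f_df(x_new)
        xs.append(x_new); vals.append(v); grads.append(g)
        best = min(best, v)
    return best, lower

# Example: minimize the largest eigenvalue of the affine Hermitian family
# A(x) = A0 + x*A1 (matrices chosen only for illustration).  The derivative
# of a simple eigenvalue is v^H A1 v for a unit eigenvector v; since the
# largest eigenvalue of an affine family is convex in x, gamma = 0 is a
# valid lower bound on its second derivative.
A0 = np.array([[2.0, 1.0], [1.0, 0.0]])
A1 = np.array([[-1.0, 0.0], [0.0, 1.0]])

def f_df(x):
    w, V = np.linalg.eigh(A0 + x * A1)
    v = V[:, -1]                     # eigenvector of the largest eigenvalue
    return w[-1], float(v @ A1 @ v)

best, lower = minimize_with_quadratic_supports(f_df, 0.0, 2.0, gamma=0.0)
# For this family, lambda_max(A(x)) = 1 + sqrt((x-1)^2 + 1), minimized at x = 1.
```

For this example the global minimum is 2, attained at x = 1; `lower` remains a valid lower bound on the minimum throughout, which is what makes the certified global convergence of the method possible.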
Recommendations
- scientific article; zbMATH DE number 4096706
- A Support Function Based Algorithm for Optimization with Eigenvalue Constraints
- On minimizing the largest eigenvalue of a symmetric matrix
- Large-Scale Optimization of Eigenvalues
- A quadratically convergent local algorithm on minimizing sums of the largest eigenvalues of a symmetric matrix
Cited in (27)
- Variational characterization and Rayleigh quotient iteration of 2D eigenvalue problem with applications
- Black-box learning of multigrid parameters
- Model Order Reduction in Contour Integral Methods for Parametric PDEs
- Matrix polynomials with specified eigenvalues
- A subspace framework for \(\mathcal{H}_\infty \)-norm minimization
- Nonsmooth optimization method for \(H_\infty\) output feedback control
- A Support Function Based Algorithm for Optimization with Eigenvalue Constraints
- Model order reduction for delay systems by iterative interpolation
- A subspace method for large-scale eigenvalue optimization
- Subspace acceleration for the Crawford number and related eigenvalue optimization problems
- Large-Scale Optimization of Eigenvalues
- Large-scale computation of \(\mathcal{L}_\infty\)-norms by a greedy subspace method
- Spectrally constrained optimization
- Generalized derivatives of eigenvalues of a symmetric matrix
- Inheritance properties of Krylov subspace methods for continuous-time algebraic Riccati equations
- Robust stability optimization for linear delay systems in a probabilistic framework
- Generating eigenvalue bounds using optimization
- Krylov subspace methods for discrete-time algebraic Riccati equations
- Differential equations for real-structured defectivity measures
- Large-scale minimization of the pseudospectral abscissa
- Approximate residual-minimizing shift parameters for the low-rank ADI iteration
- Subspace acceleration for large-scale parameter-dependent Hermitian eigenproblems
- Approximation of stability radii for large-scale dissipative Hamiltonian systems
- Large-scale and global maximization of the distance to instability
- An unconstrained global optimization framework for real symmetric eigenvalue problems
- Nonlinear eigenvector methods for convex minimization over the numerical range
- Derivatives of symplectic eigenvalues and a Lidskii type theorem