Second-order conditions for existence of augmented Lagrange multipliers for eigenvalue composite optimization problems
DOI: 10.1007/s10898-015-0273-8 · zbMATH Open: 1357.90119 · OpenAlex: W2147456049 · MaRDI QID: Q496593 · FDO: Q496593
Publication date: 22 September 2015
Published in: Journal of Global Optimization
Full work available at URL: https://doi.org/10.1007/s10898-015-0273-8
Recommendations
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- The space decomposition theory for a class of eigenvalue optimizations
- Augmented Lagrangian duality for composite optimization problems
- A space decomposition scheme for maximum eigenvalue functions and its applications
- The $\mathcal{U}$-Lagrangian of the Maximum Eigenvalue Function
Keywords: augmented Lagrange multiplier; Moreau envelope; eigenvalue composite optimization problems; second-order epi-derivative
MSC classification: Optimality conditions and duality in mathematical programming (90C46); Nonconvex programming, global optimization (90C26)
Cites Work
- Variational Analysis
- Convex analysis and monotone operator theory in Hilbert spaces
- Title not available
- Title not available
- Signal Recovery by Proximal Forward-Backward Splitting
- Lagrange Multipliers and Optimality
- Title not available
- Prox-regular functions in variational analysis
- Augmented Lagrange Multiplier Functions and Duality in Nonconvex Programming
- Sensitivity analysis of all eigenvalues of a symmetric matrix
- Twice differentiable spectral functions
- Derivatives of Spectral Functions
- Convex spectral functions
- Second-Order Optimality Conditions in Nonlinear Programming Obtained by Way of Epi-Derivatives
- First- and Second-Order Epi-Differentiability in Nonlinear Programming
- Title not available
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- Perturbation theory of nonlinear programs when the set of optimal solutions is not a singleton
- Second-order directional derivatives of all eigenvalues of a symmetric matrix
- Local convergence of exact and inexact augmented Lagrangian methods under the second-order sufficient optimality condition
- On Minimizing the Maximum Eigenvalue of a Symmetric Matrix
- On Eigenvalue Optimization
- Semi-Definite Matrix Constraints in Optimization
- Large-Scale Optimization of Eigenvalues
- Some Properties of the Augmented Lagrangian in Cone Constrained Optimization
- First- and second-order epi-differentiability in eigenvalue optimization
- Generalized Second Derivatives of Convex Functions and Saddle Functions
- Generalized Second-Order Derivatives of Convex Functions in Reflexive Banach Spaces
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- An Extremum Property of Sums of Eigenvalues
- Convergence analysis of the augmented Lagrangian method for nonlinear second-order cone optimization problems
Cited In (4)
- Existence of augmented Lagrange multipliers: reduction to exact penalty functions and localization principle
- Second-Order Conditions for the Existence of Augmented Lagrange Multipliers for Sparse Optimization
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- The space decomposition method for the sum of nonlinear convex maximum eigenvalues and its applications