A second-order optimality condition with first- and second-order complementarity associated with global convergence of algorithms
From MaRDI portal
Publication: 1639726
DOI: 10.1007/s10589-018-0005-3
zbMATH Open: 1391.90636
OpenAlex: W2796766994
Wikidata: Q111288278
Scholia: Q111288278
MaRDI QID: Q1639726
FDO: Q1639726
Author: Gabriel Haeser
Publication date: 13 June 2018
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-018-0005-3
Recommendations
- A second-order sequential optimality condition associated to the convergence of optimization algorithms
- On second-order optimality conditions for nonlinear programming
- A new sequential optimality condition for constrained optimization and algorithmic consequences
- Local convergence of exact and inexact augmented Lagrangian methods under the second-order sufficient optimality condition
- Second-order global optimality conditions for convex composite optimization
Cites Work
- Title not available
- Practical augmented Lagrangian methods for constrained optimization
- Title not available
- Second-order negative-curvature methods for box-constrained and general constrained optimization
- Title not available
- On Augmented Lagrangian Methods with General Lower-Level Constraints
- Title not available
- Trust Region Methods
- Two new weak constraint qualifications and applications
- Title not available
- Convergence to second-order stationary points in inequality constrained optimization
- Global Convergence of a Class of Trust Region Algorithms for Optimization with Simple Bounds
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- A note on upper Lipschitz stability, error bounds, and critical multipliers for Lipschitz-continuous KKT systems
- Interior-point \(\ell_2\)-penalty methods for nonlinear programming with strong global convergence properties
- A cone-continuity constraint qualification and algorithmic consequences
- On sequential optimality conditions for smooth constrained optimization
- A new sequential optimality condition for constrained optimization and algorithmic consequences
- Some theoretical limitations of second-order algorithms for smooth constrained optimization
- On second-order optimality conditions for nonlinear programming
- A primal-dual algorithm for nonlinear programming exploiting negative curvature directions
- On the Convergence Theory of Trust-Region-Based Algorithms for Equality-Constrained Optimization
- An augmented Lagrangian interior-point method using directions of negative curvature
- On affine scaling algorithms for nonconvex quadratic programming
- A stabilized SQP method: global convergence
- Corrigendum to: "On the complexity of finding first-order critical points in constrained nonlinear optimization"
- A new trust-region algorithm for equality constrained optimization
- On the complexity of approximating a KKT point of quadratic programming
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- On the global convergence of interior-point nonlinear programming algorithms
- Convergent Infeasible Interior-Point Trust-Region Methods for Constrained Minimization
- Convergence to Second-Order Stationary Points of a Primal-Dual Algorithm Model for Nonlinear Programming
- Erratum to: "A second-order sequential optimality condition associated to the convergence of optimization algorithms"
- Evaluation complexity for nonlinear constrained optimization using unscaled KKT conditions and high-order models
- Strict Constraint Qualifications and Sequential Optimality Conditions for Constrained Optimization
Cited In (14)
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- On the behavior of Lagrange multipliers in convex and nonconvex infeasible interior point methods
- On the weak second-order optimality condition for nonlinear semidefinite and second-order cone programming
- New sequential optimality conditions for mathematical programs with complementarity constraints and algorithmic consequences
- On optimality conditions for nonlinear conic programming
- Optimality conditions and global convergence for nonlinear semidefinite programming
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
- Optimality conditions for nonlinear second-order cone programming and symmetric cone programming
- On the use of Jordan algebras for improving global convergence of an augmented Lagrangian method in nonlinear semidefinite programming
- An augmented Lagrangian algorithm for nonlinear semidefinite programming applied to the covering problem
- On the fulfillment of the complementary approximate Karush-Kuhn-Tucker conditions and algorithmic applications
- A second-order sequential optimality condition associated to the convergence of optimization algorithms
- Augmented Lagrangians quadratic growth and second-order sufficient optimality conditions
- On the approximate solutions of augmented subproblems within sequential methods for nonlinear programming