Enhanced Karush-Kuhn-Tucker condition and weaker constraint qualifications
DOI: 10.1007/s10107-013-0667-7 · zbMATH Open: 1285.90078 · OpenAlex: W2041718604 · MaRDI QID: Q353154 · FDO: Q353154
Authors: Jane J. Ye, Jin Zhang
Publication date: 12 July 2013
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-013-0667-7
Recommendations
- Second-order enhanced optimality conditions and constraint qualifications
- Improved enhanced Fritz John condition and constraint qualifications using convexificators
- Enhanced Karush-Kuhn-Tucker conditions for mathematical programs with equilibrium constraints
- Convexificators and boundedness of the Kuhn-Tucker multipliers set
- Mathematical programs with geometric constraints in Banach spaces: enhanced optimality, exact penalty, and sensitivity
Keywords: nonsmooth analysis; value function; limiting subdifferential; calmness; constraint qualification; complementary violation condition; enhanced Fritz John condition; enhanced KKT condition; Ljusternik theorem; local error bound; pseudonormal; quasinormal
MSC classification: Optimality conditions and duality in mathematical programming (90C46); Nonlinear programming (90C30); Sensitivity, stability, parametric optimization (90C31); Nonsmooth analysis (49J52)
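For background, the classical KKT system that the enhanced condition refines can be sketched as follows. This is a standard textbook formulation assumed here for illustration, not quoted from the paper; the objective f, inequality constraints g_i, and equality constraints h_j are generic placeholder names.

\[
\begin{aligned}
& \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) + \sum_{j=1}^{p} \mu_j \nabla h_j(x^*) = 0, \\
& \lambda_i \ge 0, \qquad \lambda_i \, g_i(x^*) = 0, \qquad i = 1, \dots, m.
\end{aligned}
\]

These conditions are necessary for optimality only under a constraint qualification; per the keywords above, the paper's enhanced KKT condition additionally restricts how multipliers relate to violated constraints (the complementary violation condition) and is shown to hold under weaker qualifications such as quasinormality and pseudonormality, which also yield local error bounds. The precise statements are in the paper itself.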
Cites Work
- Variational Analysis
- Techniques of variational analysis
- The Fritz John necessary optimality conditions in the presence of equality and inequality constraints
- Title not available
- Optimization and nonsmooth analysis
- Title not available
- On the Calmness of a Class of Multifunctions
- Title not available
- Subgradients of marginal functions in parametric mathematical programming
- Sensitivity analysis of the value function for optimization problems with variational inequality constraints
- Title not available
- Title not available
- The Lagrange Multiplier Rule
- Title not available
- Convex analysis and nonlinear optimization. Theory and examples
- A New Approach to Lagrange Multipliers
- Fréchet subdifferential calculus and optimality conditions in nondifferentiable programming
- Title not available
- Constraint qualifications and Lagrange multipliers in nondifferentiable programming problems
- Sufficient conditions for error bounds
- Necessary Optimality Conditions for Optimization Problems with Variational Inequality Constraints
- Pseudonormality and a Lagrange multiplier theory for constrained optimization
- Mathematical Programs with Equilibrium Constraints: Enhanced Fritz John Conditions, New Constraint Qualifications, and Improved Exact Penalty Results
- Title not available
- The relation between pseudonormality and quasiregularity in constrained optimization
- Lagrange Multipliers for Nonconvex Generalized Gradients with Equality, Inequality, and Set Constraints
- On error bounds for quasinormal programs
- Enhanced Fritz John Conditions for Convex Programming
Cited In (23)
- Directional Quasi-/Pseudo-Normality as Sufficient Conditions for Metric Subregularity
- Constraint qualifications in terms of convexificators for nonsmooth programming problems with mixed constraints
- New Constraint Qualifications for S-Stationarity for MPEC with Nonsmooth Objective
- Enhanced Fritz John stationarity, new constraint qualifications and local error bound for mathematical programs with vanishing constraints
- Constraint qualifications for mathematical programs with equilibrium constraints and their local preservation property
- Second-order enhanced optimality conditions and constraint qualifications
- A Sequential Optimality Condition Related to the Quasi-normality Constraint Qualification and Its Algorithmic Consequences
- New results on constraint qualifications for nonlinear extremum problems and extensions
- Karush-Kuhn-Tucker conditions and Lagrangian approach for improving machine learning techniques: a survey and new developments
- Enhanced Karush-Kuhn-Tucker conditions for mathematical programs with equilibrium constraints
- Necessary optimality conditions and exact penalization for non-Lipschitz nonlinear programs
- On enhanced KKT optimality conditions for smooth nonlinear optimization
- A symmetric Gauss-Seidel based method for a class of multi-period mean-variance portfolio selection problems
- Saddle point approximation approaches for two-stage robust optimization problems
- Constraint qualifications for nonsmooth programming
- Multiobjective Problems: Enhanced Necessary Conditions and New Constraint Qualifications through Convexificators
- Characterization of generalized FJ and KKT conditions in nonsmooth nonconvex optimization
- On scaled stopping criteria for a safeguarded augmented Lagrangian method with theoretical guarantees
- Characterizing FJ and KKT Conditions in Nonconvex Mathematical Programming with Applications
- Strong duality and KKT conditions in nonconvex optimization with a single equality constraint and geometric constraint
- On an \(l_1\) exact penalty result for mathematical programs with vanishing constraints
- Title not available
- Title not available