Convergence analysis of a class of nonlinear penalization methods for constrained optimization via first-order necessary optimality conditions
Publication: 1411462
DOI: 10.1023/A:1022503820909 · zbMath: 1045.90063 · OpenAlex: W219301653 · MaRDI QID: Q1411462
Publication date: 29 October 2003
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1023/a:1022503820909
Keywords: necessary optimality conditions; differentiability; locally Lipschitz functions; nonlinear penalization; smooth approximate variational principle
Related Items (6)
- Convergence analysis of a class of penalty methods for vector optimization problems with cone constraints
- Lower-order smoothed objective penalty functions based on filling properties for constrained optimization problems
- An interior-point \(\ell_{\frac{1}{2}}\)-penalty method for inequality constrained nonlinear optimization
- Second-Order Smoothing Objective Penalty Function for Constrained Optimization Problems
- An exact lower order penalty function and its smoothing in nonlinear programming
- Smoothing of the lower-order exact penalty function for inequality constrained optimization
Cites Work
- Convex composite multi-objective nonsmooth programming
- Second-order global optimality conditions for convex composite optimization
- An exterior point method for computing points that satisfy second-order necessary conditions for a \(C^{1,1}\) optimization problem
- First and second-order optimality conditions for convex composite multiobjective optimization
- Proximal analysis and minimization principles
- Optimization and nonsmooth analysis
- A Smooth Variational Principle With Applications to Subdifferentiability and to Differentiability of Convex Functions
- First and second order conditions for a class of nondifferentiable optimization problems
- On conditions for optimality of the nonlinear \(l_1\) problem
- Optimality conditions for piecewise smooth functions
- Necessary and sufficient optimality conditions for a class of nonsmooth minimization problems
- Penalty methods for computing points that satisfy second order necessary conditions
- Composite Nonsmooth Programming with Gâteaux Differentiability
- Variational Analysis
- Decreasing Functions with Applications to Penalization
- Extended Lagrange and Penalty Functions in Continuous Optimization