A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization (Q393748)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization | scientific article |
Statements
A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization (English)
24 January 2014
The paper presents an algorithm that combines the gradient sampling (GS) technique with the sequential quadratic programming (SQP) method for nonconvex, nonsmooth constrained optimization problems with locally Lipschitz, continuously differentiable functions. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function values decrease monotonically. It is an alternative to the penalty-function-based SQP-GS algorithm proposed by \textit{F. E. Curtis} and \textit{M. L. Overton} [SIAM J. Optim. 22, No. 2, 474--500 (2012; Zbl 1246.49031)]. Instead of a penalty function serving as a merit function to generate the next iterate, the authors use an improvement function, one of the most effective tools for handling constraints, which plays a significant role in the global convergence analysis.
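To illustrate the gradient sampling idea underlying such methods, the following is a minimal sketch (not the authors' SQP-GS algorithm) of the GS direction-finding step for an unconstrained nonsmooth function: gradients are sampled in a ball around the current point, and a descent direction is obtained from the minimum-norm element of their convex hull. All names (`gs_direction`, `eps`, `m`) and the small quadratic subproblem solved via SciPy are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical sketch of one gradient sampling (GS) step; the paper's
# SQP-GS method additionally handles constraints via an improvement function.
import numpy as np
from scipy.optimize import minimize

def gs_direction(grad, x, eps=0.1, m=20, rng=None):
    """Sample gradients in an eps-ball around x and return the negative of
    the minimum-norm element of their convex hull (a descent direction)."""
    rng = np.random.default_rng(0) if rng is None else rng
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.vstack([grad(x)] + [grad(p) for p in pts])  # sampled gradients
    k = G.shape[0]
    # Small QP:  min ||lam @ G||^2  s.t.  lam >= 0,  sum(lam) = 1
    obj = lambda lam: float(np.dot(lam @ G, lam @ G))
    cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    res = minimize(obj, np.full(k, 1.0 / k),
                   bounds=[(0.0, 1.0)] * k, constraints=cons)
    return -(res.x @ G)

# Example: f(x) = |x0| + x1^2, with a gradient defined almost everywhere.
grad = lambda x: np.array([np.sign(x[0]), 2.0 * x[1]])
d = gs_direction(grad, np.array([0.05, 1.0]))
```

Near the kink at `x0 = 0`, sampled gradients have first components of both signs, so the min-norm convex combination nearly cancels them and the returned direction points mainly downhill in the smooth coordinate.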
Keywords: constrained optimization; nonsmooth optimization; nonconvex optimization; nonlinear programming; gradient sampling; sequential quadratic programming; feasible algorithm; global convergence; Clarke subdifferential; numerical experiments