Nonsmooth equations in optimization. Regularity, calculus, methods and applications (Q1348520)

From MaRDI portal
Reviewed by: Joseph Frédéric Bonnans

Revision as of 18:28, 19 February 2024

scientific article

    Statements

    Nonsmooth equations in optimization. Regularity, calculus, methods and applications (English)
    13 May 2002
    The book deals with the theory of first- and second-order optimality conditions for nonlinear programming problems (NLP), as well as with the perturbation analysis of the value function and of solutions of such problems. The difficulty here lies in the fact that the cost function and constraints are not assumed to be twice continuously differentiable. The first chapters emphasize (strong) regularity of a multifunction, in the sense that the inverse mapping is locally (single-valued) pseudo-Lipschitz. Regularity of the constraints, through an argument based on an exact penalty function, allows one to obtain Lagrange multipliers associated with local solutions. The book discusses various characterizations of regularity based on contingent and generalized derivatives, as well as Thibault's limit set. The relations between these derivatives, chain rules and mean value theorems are presented. Other important parts of the book are the study of Kojima functions, which are a rewriting of the optimality systems of NLP, assuming the data to be \(C^{1,1}\) (i.e., to have Lipschitz derivatives), and the study of perturbations of NLP. The last chapters discuss Newton's method for Lipschitz equations and its application to perturbed Kojima systems. Some results are presented in a Banach space setting, and others, such as those on perturbed NLP, in a finite-dimensional setting. Probably due to space limitations, no link is discussed with semi-infinite programming, which can be reformulated as a nonsmooth NLP. I found the book quite useful. The literature on this subject is, like the subject itself, quite technical and not easy to read. This book is a very convenient guide to a selection of the most important properties, in a field where there are still many discoveries to be made.
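    The flavor of Newton's method for Lipschitz equations mentioned above can be illustrated on a toy complementarity problem. The sketch below is hypothetical and not taken from the book: it rewrites the one-dimensional complementarity condition \(x \ge 0\), \(x - 1 \ge 0\), \(x(x-1) = 0\) as a nonsmooth equation via the Fischer-Burmeister function (one standard choice of NCP function), then applies a Newton iteration using one element of the generalized derivative at each step.

```python
import math

def fischer_burmeister(a, b):
    # NCP function: phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0.
    # phi is Lipschitz everywhere but nondifferentiable at (0, 0).
    return a + b - math.sqrt(a * a + b * b)

def semismooth_newton(F, G, x0, tol=1e-12, max_iter=50):
    """Newton's method for a scalar Lipschitz equation F(x) = 0,
    where G(x) returns one element of the generalized derivative."""
    x = x0
    for _ in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            break
        x -= fx / G(x)
    return x

# Toy problem: x >= 0, x - 1 >= 0, x*(x - 1) = 0, whose solution is x = 1,
# rewritten as the nonsmooth equation F(x) = phi(x, x - 1) = 0.
F = lambda x: fischer_burmeister(x, x - 1)

def G(x):
    # One element of the generalized derivative of x -> phi(x, x - 1);
    # away from (0, 0) the Fischer-Burmeister function is smooth.
    r = math.hypot(x, x - 1)
    if r == 0.0:
        return 1.0  # arbitrary element of the generalized derivative
    return 2.0 - (x + (x - 1)) / r

print(semismooth_newton(F, G, x0=2.0))  # converges to 1.0
```

    Away from the kink the iteration is the classical Newton method; the point of the nonsmooth theory surveyed in the book is that, under regularity assumptions, such iterations still converge locally superlinearly even when the equation is only Lipschitz.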
    nonlinear programming
    generalized derivatives
    Kojima functions
    parametric optimization
    Newton's method
