Necessary and sufficient KKT optimality conditions in non-convex optimization (Q2361126)

From MaRDI portal
scientific article

    Statements

    29 June 2017
    The author provides necessary and sufficient Karush-Kuhn-Tucker (KKT) optimality conditions for the non-convex optimization problem \[ \min_x\{f(x): g_j(x)\leq 0, \,j=1,2,\dots, m\},\quad x\in \mathbb{R}^n, \] where \(f,g_j:\mathbb{R}^n\to \mathbb{R}\), \(j=1,2,\dots, m\). The author proves that if Slater's condition, a non-degeneracy assumption, and a suitable condition on the feasible set are satisfied, then the KKT optimality conditions are necessary without any convexity assumption on the functions involved or on the feasible set. Furthermore, the non-trivial KKT optimality conditions are globally sufficient for a point \(x\) to be a global minimizer, provided that the strict level set \(L^<_f(x)\) of the objective function is convex. In particular, if \(L_f(x)\) is strictly convex at \(x\), then \(x\) is a strict global minimizer of the optimization problem.
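    The KKT conditions discussed above (stationarity, primal and dual feasibility, complementary slackness) can be verified numerically at a candidate point. The following is a minimal sketch, not taken from the article: the helper names and the toy instance \(f(x)=x^3\), \(g(x)=1-x\le 0\) are illustrative choices. The instance is deliberately non-convex in \(f\), yet its strict level set \(\{x: x^3<1\}\) is convex, matching the sufficiency hypothesis of the theorem, with minimizer \(x^*=1\) and multiplier \(\lambda=3\).

    ```python
    import numpy as np

    def numerical_grad(fun, x, h=1e-6):
        """Central-difference approximation of the gradient of fun at x."""
        x = np.asarray(x, dtype=float)
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (fun(x + e) - fun(x - e)) / (2 * h)
        return g

    def kkt_residuals(f, gs, x, lam):
        """Residuals of the four KKT conditions at (x, lam):
        stationarity, primal feasibility, dual feasibility, complementarity."""
        grad_L = numerical_grad(f, x) + sum(
            l * numerical_grad(g, x) for l, g in zip(lam, gs))
        stationarity = float(np.linalg.norm(grad_L))
        primal = max((max(g(x), 0.0) for g in gs), default=0.0)
        dual = max((max(-l, 0.0) for l in lam), default=0.0)
        complementarity = max(
            (abs(l * g(x)) for l, g in zip(lam, gs)), default=0.0)
        return stationarity, primal, dual, complementarity

    # Hypothetical toy instance (not from the article):
    # minimize f(x) = x^3 subject to g(x) = 1 - x <= 0.
    f = lambda x: x[0] ** 3
    g = lambda x: 1.0 - x[0]
    res = kkt_residuals(f, [g], np.array([1.0]), [3.0])
    print(res)  # all four residuals should be close to zero
    ```

    All four residuals vanish (up to finite-difference error) at \(x^*=1\), consistent with \(x^*\) being a KKT point; the convexity of the strict level set is what upgrades this, per the theorem, to global optimality.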
    convex optimization
    Slater's condition
    level set
    Karush-Kuhn-Tucker optimality conditions
