A successive quadratic programming method for a class of constrained nonsmooth optimization problems (Q1174457)

Full work available at URL: https://doi.org/10.1007/bf01588789
OpenAlex ID: W2049068112

Language: English
Label: A successive quadratic programming method for a class of constrained nonsmooth optimization problems
Description: scientific article

    Statements

    A successive quadratic programming method for a class of constrained nonsmooth optimization problems (English)
    25 June 1992
    This paper is a model of clarity and completeness. It describes an iterative procedure for approximating Kuhn-Tucker points of the problem \[ \text{minimize } f(x)+F(x)\quad\text{subject to } c_i(x)=0,\;\; i=1,2,\dots,m.\leqno(P) \] Here \(F\) and the \(c_i\), \(i=1,2,\dots,m\), are twice continuously differentiable functions from \(\mathbb{R}^n\) into \(\mathbb{R}\), while \(f:\mathbb{R}^n\to\mathbb{R}\) is a (not necessarily differentiable) convex function. Given positive tolerances \(\varepsilon_0\), \(\varepsilon_1\), and \(\varepsilon_2\), the algorithm takes finitely many steps to identify a point \(x\in\mathbb{R}^n\) and a multiplier vector \(u\in\mathbb{R}^m\) satisfying the approximate Kuhn-Tucker conditions \[ 0\in\partial_{\varepsilon_0}f(x)+\nabla F(x)+Dc(x)^T u+\varepsilon_1 S^n,\qquad 0\in c(x)+\varepsilon_2 S^m, \] where \(S^k\) denotes \(\{w\in\mathbb{R}^k: |w|=1\}\). Starting from a point \(x_0\in\mathbb{R}^n\), the algorithm first formulates a linearly constrained convex programming problem whose solution gives a promising search direction based at \(x_0\). This auxiliary problem is solved approximately by an iterative cutting-plane type approach, during which the penalty parameter used in the line search is updated. Once the search direction and the penalty parameter have been found, a line search at a geometric rate generates an improved approximation \(x_1\), and the whole procedure begins again. The paper explains the details of each step and provides carefully stated theorems with detailed proofs to guarantee termination after finitely many steps. The rate of convergence is not discussed explicitly, and there are no computational examples.
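The penalty/line-search mechanics just described can be caricatured on a toy instance. The sketch below is not the author's method: it replaces the paper's linearly constrained QP subproblem and cutting-plane phase with a single subgradient step, fixes the penalty parameter in advance, and keeps only the exact penalty function and the geometric (backtracking) line search. All names and the problem data are illustrative assumptions.

```python
import numpy as np

# Assumed toy instance of problem (P): minimize f(x) + F(x) s.t. c(x) = 0, with
#   f(x) = |x1|                  (convex, nonsmooth)
#   F(x) = (x1-1)^2 + (x2-2)^2   (smooth)
#   c(x) = x1 + x2 - 2           (single equality constraint)
f  = lambda x: abs(x[0])
F  = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
gF = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
c  = lambda x: x[0] + x[1] - 2.0

rho = 2.0                        # exact-penalty parameter, fixed here for simplicity
phi = lambda x: f(x) + F(x) + rho * abs(c(x))   # exact penalty function

def one_subgradient(x):
    """One subgradient of phi at x: a single subdifferential element,
    mirroring the 'one subgradient per step' point made in the review."""
    g_f = np.array([np.sign(x[0]), 0.0])             # a subgradient of |x1|
    g_c = rho * np.sign(c(x)) * np.array([1.0, 1.0]) # a subgradient of rho*|c|
    return g_f + gF(x) + g_c

def solve(x0, iters=200, sigma=1e-4, shrink=0.5):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = one_subgradient(x)
        d = -g                   # crude stand-in for the QP search direction
        t = 1.0
        # geometric (backtracking) line search on the penalty function
        while t > 1e-12 and phi(x + t * d) > phi(x) - sigma * t * (g @ g):
            t *= shrink
        if t <= 1e-12:
            break                # no acceptable step length: stop
        x = x + t * d
    return x

x0 = np.array([1.5, 1.5])
x  = solve(x0)                   # drives phi down and |c| toward zero
```

Because every accepted step satisfies an Armijo-type decrease condition on \(\varphi\), the penalty value falls monotonically; on this instance the iterates approach the hand-computed minimizer \((0.25, 1.75)\).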
However, the author claims that the algorithm is ``implementable'' in the sense that, instead of requiring complete knowledge of the subgradients of the nonsmooth convex function \(f\), the procedure needs only one subgradient vector at each step.
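Once a candidate pair \((x,u)\) is in hand, the approximate Kuhn-Tucker conditions are straightforward to test directly. A minimal check on an assumed toy instance, with the candidate pair computed by hand (none of this is from the paper):

```python
import math

# Assumed toy instance (illustrative only):
#   f(x) = |x1|, F(x) = (x1-1)^2 + (x2-2)^2, c(x) = x1 + x2 - 2.
# Hand-computed Kuhn-Tucker candidate: x = (0.25, 1.75), u = 0.5.
x = (0.25, 1.75)
u = 0.5
eps1, eps2 = 1e-8, 1e-8           # tolerances in the approximate KT test

g_f  = (1.0, 0.0)                 # subgradient of |x1| at x1 = 0.25 > 0
g_F  = (2 * (x[0] - 1), 2 * (x[1] - 2))   # gradient of F at x
Dc_u = (u, u)                     # Dc(x)^T u, with Dc(x) = (1, 1)

# Residual of 0 in df(x) + grad F(x) + Dc(x)^T u (taking eps0 = 0, so the
# eps0-subdifferential reduces to the ordinary subdifferential of f).
r = tuple(a + b + d for a, b, d in zip(g_f, g_F, Dc_u))
kt_ok = math.hypot(*r) <= eps1 and abs(x[0] + x[1] - 2.0) <= eps2
```

Here the residual vanishes exactly, so the pair passes the test for any positive tolerances; in general the test accepts whenever the residual norm is at most \(\varepsilon_1\) and the constraint violation at most \(\varepsilon_2\).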
    constrained nonsmooth optimization
    exact penalization
    approximating Kuhn-Tucker points
    iterative cutting-plane type approach
    line search
