A class of superlinearly convergent projection algorithms with relaxed stepsizes (Q1057626)

From MaRDI portal
Author: Berc Rustem
MaRDI profile type: MaRDI publication profile
Cites work:
- A feasible direction algorithm for convex optimization: Global convergence rates
- On the Goldstein-Levitin-Polyak gradient projection method
- Projected Newton Methods for Optimization Problems with Simple Constraints
- On the Local Convergence of Quasi-Newton Methods for Constrained Optimization
- Quasi-Newton Methods, Motivation and Theory
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- Newton's Method and the Goldstein Step-Length Rule for Constrained Minimization Problems
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- Q5588268
- Convex programming in Hilbert space
- Superlinearly convergent variable metric algorithms for general nonlinear programming problems
- Q5566063
- The Gradient Projection Method under Mild Differentiability Conditions
- Q5672476
- Projection methods in constrained optimisation and applications to optimal policy decisions

Latest revision as of 17:34, 14 June 2024

scientific article (English)
A class of superlinearly convergent projection algorithms with relaxed stepsizes

    Statements

    A class of superlinearly convergent projection algorithms with relaxed stepsizes (English)
    1984
    Let \(f: \mathbb{R}^n \to \mathbb{R}\) and \(g: \mathbb{R}^n \to \mathbb{R}^m\) be differentiable functions and let \(D = \{x \in \mathbb{R}^n \mid g(x) \ge 0\}\) be a convex set. Consider the problem \(\min\{f(x) \mid x \in D\}\). For this typical constrained optimization problem, the author investigates a quasi-Newton extension of the Goldstein-Levitin-Polyak projected gradient algorithm. The extension projects an unconstrained descent step onto the feasible region \(D\) and is close to the work of \textit{J. C. Allwright} [J. Optimization Theory Appl. 30, 1-18 (1980; Zbl 0393.90069)]. In the algorithm presented here, the determination of the stepsize is divided into two stages. The first stage determines the length of the unconstrained step; the second determines a stepsize from the range \([0,1]\) that shortens the projected step if the projection onto \(D\) does not reduce the objective function \(f\). In the first stage, the decrease of the objective function \(f\) is bounded by a conventional linear functional, while the second stage uses a quadratic functional as a bound. We remark that the well-known Goldstein-Levitin-Polyak algorithm is relaxed here, so the quadratic subproblem involved becomes simple.
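    The two-stage idea described above can be illustrated with a minimal sketch of a Goldstein-Levitin-Polyak-style projected gradient method. This is an assumption-laden toy version, not the paper's quasi-Newton algorithm: the feasible set is taken to be the simple convex set \(D = \{x \mid x \ge 0\}\) (so the projection is a componentwise clip), stage one backtracks the unconstrained step length against a linear (Armijo-type) decrease bound, and stage two keeps a relaxation parameter \(t \in [0,1]\) along the projected step. All function and parameter names here are hypothetical.

```python
import numpy as np

def project(x):
    # Projection onto D = {x : x >= 0}; for this simple set it is a clip.
    return np.maximum(x, 0.0)

def glp_projected_gradient(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                           max_iter=200, tol=1e-8):
    """Illustrative two-stage projected-gradient sketch (not the paper's
    quasi-Newton variant). Stage 1 picks the unconstrained step length;
    stage 2 applies a relaxation stepsize t in [0, 1] to the projected step."""
    x = project(x0)
    for _ in range(max_iter):
        g = grad(x)
        # Stage 1: backtrack the unconstrained step length alpha until the
        # projected trial point satisfies a linear (Armijo-type) decrease bound.
        alpha = alpha0
        while alpha > 1e-16:
            x_trial = project(x - alpha * g)
            d = x_trial - x
            if f(x_trial) <= f(x) + sigma * g.dot(d):
                break
            alpha *= beta
        # Stage 2: a stepsize t in [0, 1] along the projected step; since the
        # trial point already reduces f here, the full step t = 1 is accepted.
        t = 1.0
        x_new = x + t * (x_trial - x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

    For example, minimizing \(f(x) = \|x - c\|^2\) with \(c = (1, -2)\) over \(D\) drives the iterates to the projection of \(c\) onto \(D\), namely \((1, 0)\).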
    relaxed stepsizes
    superlinear convergence
    constrained optimization
    quasi-Newton
    Goldstein-Levitin-Polyak projected gradient algorithm
    quadratic subproblem