A new super-memory gradient method with curve search rule (Q2571993)

From MaRDI portal

Language: English
Label: A new super-memory gradient method with curve search rule
Description: scientific article

    Statements

    Title: A new super-memory gradient method with curve search rule (English)
    Publication date: 14 November 2005
    An unconstrained minimization problem of the form \[ \text{minimize } f(x),\quad x\in\mathbb{R}^n \] is considered. It is assumed that \(f\) is a continuously differentiable function satisfying the following conditions: (i) \(f\) is bounded below on the level set \(L_0= \{x\in\mathbb{R}^n\mid f(x)\leq f(x_0)\}\), where \(x_0\) is the given starting point; (ii) the gradient of \(f\) is uniformly continuous on an open convex set \(B\) containing \(L_0\); (iii) the gradient of \(f\) is Lipschitz continuous on \(B\). A new super-memory gradient method with a curve search rule for solving this unconstrained minimization problem is described. At each iteration, the method combines the iterative information of several previous steps with a curve search rule to generate the next iterate. The method is globally convergent and attains a linear convergence rate. Numerical experiments reported at the end of the paper show that the method is computationally effective in practice.
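    Since the review only describes the algorithm in words, the following Python snippet illustrates the general idea: a descent direction built from the negative gradient plus damped previous search directions, combined with a backtracking search along a curve. It is not the authors' exact scheme; the memory length m, the weights \(\beta^i\), the quadratic curve \(x(t)=x+td+t^2h\), and the Armijo parameters are all illustrative assumptions.

```python
import numpy as np

def super_memory_gradient(f, grad, x0, m=3, beta=0.4, sigma=1e-4, rho=0.5,
                          tol=1e-6, max_iter=1000):
    """Minimize f from x0 using a super-memory gradient direction and a
    simple curvilinear Armijo-type backtracking search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    memory = []                          # previous directions d_{k-1}, ..., d_{k-m}
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Super-memory direction: steepest descent plus damped past directions.
        d = -g
        for i, d_old in enumerate(memory, start=1):
            d = d + (beta ** i) * d_old
        if g @ d >= 0:                   # safeguard: fall back to steepest descent
            d = -g
        # Assumed quadratic curve x(t) = x + t*d + t^2*h, h the latest direction.
        h = memory[0] if memory else np.zeros_like(x)
        t, fx = 1.0, f(x)
        # Backtrack along the curve until an Armijo-type decrease holds.
        while f(x + t * d + t ** 2 * h) > fx + sigma * t * (g @ d):
            t *= rho
            if t < 1e-12:
                break
        x = x + t * d + t ** 2 * h
        g = grad(x)
        memory = ([d] + memory)[:m]      # retain at most m previous directions
    return x

# Usage on a convex quadratic: minimize ||x||^2, whose minimizer is the origin.
print(super_memory_gradient(lambda x: x @ x, lambda x: 2.0 * x,
                            np.array([3.0, -2.0])))
```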
    unconstrained minimization
    gradient type methods
    curve search rule
    convergence
    numerical examples

    Identifiers