A new super-memory gradient method with curve search rule (Q2571993)

From MaRDI portal
Property / full work available at URL: https://doi.org/10.1016/j.amc.2004.10.063
Property / OpenAlex ID: W2088057602

Revision as of 18:18, 19 March 2024

scientific article
Language: English
Label: A new super-memory gradient method with curve search rule
Description: scientific article

    Statements

    A new super-memory gradient method with curve search rule (English)
    14 November 2005
    An unconstrained minimization problem of the form \[ \text{minimize }f(x),\quad x\in\mathbb{R}^n \] is considered. It is assumed that \(f\) is continuously differentiable and satisfies the following conditions: (i) \(f\) is bounded below on the level set \(L_0= \{x\in\mathbb{R}^n\mid f(x)\leq f(x_0)\}\), where \(x_0\) is the given starting point; (ii) the gradient of \(f\) is uniformly continuous in an open convex set \(B\) containing \(L_0\); (iii) the gradient of \(f\) is Lipschitz continuous in \(B\). A new super-memory gradient method with a curve search rule for solving this unconstrained minimization problem under the above assumptions is described. At each iteration, the method uses iterative information from several previous steps together with the curve search rule to generate the new iterate. The method is shown to be globally convergent with a linear convergence rate. Numerical experiments reported at the end of the paper demonstrate the method's computational effectiveness in practice.
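    The paper's exact curve search rule and memory-weighting formulas are not reproduced in this review. Purely as an illustration of the idea, the sketch below forms a search direction from the negative gradient plus up to \(m\) stored previous directions, and substitutes a standard Armijo backtracking line search for the curve search rule; all parameter names and values (`m`, `beta`, `sigma`, `rho`) are assumptions, not the paper's.

```python
import numpy as np

def super_memory_gradient(f, grad, x0, m=3, beta=0.4,
                          sigma=1e-4, rho=0.5, tol=1e-6, max_iter=500):
    """Illustrative super-memory gradient sketch (not the paper's exact method).

    The direction mixes -grad(x) with up to m previous directions; an Armijo
    backtracking search stands in for the paper's curve search rule.
    """
    x = np.asarray(x0, dtype=float)
    prev_dirs = []  # memory of recent search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Super-memory component: combine -g with stored directions.
        d = -g
        for dk in prev_dirs:
            d = d + (beta / len(prev_dirs)) * dk
        if g @ d >= 0:
            d = -g  # safeguard: fall back to steepest descent
        # Armijo backtracking line search (stand-in for the curve search rule).
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + sigma * t * slope:
            t *= rho
        x = x + t * d
        prev_dirs.append(d)
        if len(prev_dirs) > m:
            prev_dirs.pop(0)  # keep only the m most recent directions
    return x
```

    On a smooth function satisfying assumptions (i)-(iii), the safeguard keeps every direction a descent direction, so the Armijo loop terminates and the iterates decrease \(f\) monotonically, e.g. when minimizing an ill-conditioned quadratic such as \(f(x)=x_1^2+10x_2^2\).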
    unconstrained minimization
    gradient type methods
    curve search rule
    convergence
    numerical examples

    Identifiers