A new super-memory gradient method with curve search rule (Q2571993)
From MaRDI portal
scientific article
Statements
A new super-memory gradient method with curve search rule (English)
14 November 2005
An unconstrained minimization problem of the form \[ \text{minimize }f(x),\quad x\in\mathbb{R}^n \] is considered. It is assumed that \(f\) is continuously differentiable and satisfies the following conditions: (i) \(f\) is bounded below on the level set \(L_0= \{x\in\mathbb{R}^n\mid f(x)\leq f(x_0)\}\), where \(x_0\) is a given starting point; (ii) the gradient of \(f\) is uniformly continuous on an open convex set \(B\) containing \(L_0\); (iii) the gradient of \(f\) is Lipschitz continuous on \(B\). A new super-memory gradient method with a curve search rule for solving the unconstrained minimization problem under these assumptions is described. At each iteration, the method uses information from several previous iterates together with a curve search rule to generate the new iterate. The method is shown to be globally convergent with a linear convergence rate. Numerical experiments reported at the end of the paper demonstrate good computational effectiveness in practice.
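To illustrate the general idea of a super-memory gradient iteration, the following is a minimal sketch, not the paper's exact method: the search direction combines the steepest-descent direction with a few previous directions, and the step length is chosen by a backtracking (Armijo-type) line search in place of the authors' specific curve search rule. The memory depth `m`, weight `beta`, and line-search parameters `sigma`, `rho` are illustrative choices.

```python
import numpy as np

def super_memory_gradient(f, grad, x0, m=3, beta=0.4, sigma=1e-4, rho=0.5,
                          tol=1e-6, max_iter=500):
    """Illustrative multi-step (super-memory) gradient method.

    The direction is d_k = -g_k + sum of weighted previous directions;
    the step is found by backtracking until the Armijo condition holds.
    This is a generic sketch, not the curve search rule of the paper.
    """
    x = np.asarray(x0, dtype=float)
    prev_dirs = []                      # up to m previous search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:    # stop when the gradient is small
            break
        # combine steepest descent with weighted previous directions
        d = -g + sum(beta**(i + 1) * dp
                     for i, dp in enumerate(reversed(prev_dirs)))
        if g @ d >= 0:                  # safeguard: ensure a descent direction
            d = -g
        # backtracking line search (Armijo condition)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + sigma * t * (g @ d):
            t *= rho
        x = x + t * d
        prev_dirs.append(d)
        if len(prev_dirs) > m:          # keep only the last m directions
            prev_dirs.pop(0)
    return x
```

On a simple strictly convex quadratic such as \(f(x)=\|x\|^2\), the iteration drives the gradient norm below the tolerance; the safeguard guarantees each accepted direction is a descent direction, so the backtracking loop always terminates.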
unconstrained minimization
gradient type methods
curve search rule
convergence
numerical examples