A new class of supermemory gradient methods (Q865511)
From MaRDI portal
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | A new class of supermemory gradient methods | scientific article | |
Statements
A new class of supermemory gradient methods (English)
19 February 2007
An unconstrained optimization problem of minimizing a continuously differentiable function \(f:\mathbb R^n\to \mathbb R\) is considered. A new class of supermemory gradient methods for this unconstrained minimization problem is proposed. Conditions that guarantee global convergence are presented. In a sufficiently small neighborhood of the optimal solution the algorithms reduce to quasi-Newton methods. Unlike many line search methods, the proposed algorithms use not only the current iterative information but also information obtained from preceding iterations. Numerical results reported in the concluding part of the paper show the effectiveness of the proposed algorithms in practical computation.
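For orientation, a generic supermemory gradient iteration (not necessarily the authors' exact formulation; the memory length \(m\), the scalars \(\beta_{k,i}\), and the step size \(\alpha_k\) are assumed here for illustration) combines the current gradient with several preceding search directions:
\[
d_k = -g_k + \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i}, \qquad x_{k+1} = x_k + \alpha_k d_k,
\]
where \(g_k = \nabla f(x_k)\), \(m\ge 1\) is the number of stored previous directions, and \(\alpha_k\) is determined by a line search. The methods reviewed here belong to this family, with the choice of parameters designed to ensure global convergence and a quasi-Newton-like behavior near the solution.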
unconstrained optimization
global convergence
quasi-Newton methods
algorithms
numerical results