Convergence of supermemory gradient method
From MaRDI portal
Publication: 2454990
DOI: 10.1007/BF02832325  zbMath: 1149.90153  MaRDI QID: Q2454990
Publication date: 22 October 2007
Published in: Journal of Applied Mathematics and Computing
Numerical mathematical programming methods (65K05)
Nonlinear programming (90C30)
Numerical methods based on nonlinear programming (49M37)
Related Items (2)
- A new supermemory gradient method for unconstrained optimization problems
- SABRINA: a stochastic subspace majorization-minimization algorithm
Cites Work
- Supermemory descent methods for unconstrained minimization
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A new descent algorithm with curve search rule
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Study on a supermemory gradient method for the minimization of functions
- Quadratically convergent algorithms and one-dimensional search schemes
- A new super-memory gradient method with curve search rule
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence of multi-step curve search method for unconstrained optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- On the Barzilai and Borwein choice of steplength for the gradient method
- Function minimization by conjugate gradients
- Conjugate Directions without Linear Searches
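The cited works center on (super)memory gradient methods, which reuse several previous search directions when building the current one. As a rough illustration of that idea only (not a reconstruction of this paper's algorithm), the sketch below minimizes a convex quadratic with a direction d_k = -g_k + Σ β_i d_{k-i}; the choice |β_i| ≤ ρ·‖g_k‖/‖d_{k-i}‖ with ρ < 1/m and the Armijo backtracking search are assumptions made here to guarantee descent, not details taken from the publication.

```python
import numpy as np

def supermemory_gradient(f, grad, x0, m=3, rho=0.1, tol=1e-8, max_iter=500):
    """Illustrative memory-m gradient method (a sketch, not the paper's scheme).

    Direction: d_k = -g_k + sum_i beta_i * d_{k-i}, with
    beta_i = rho * ||g_k|| / ||d_{k-i}|| and rho < 1/m, so that
    g_k . d_k <= -(1 - m*rho) * ||g_k||^2 < 0 (a descent direction).
    The step size comes from a standard Armijo backtracking search.
    """
    x = np.asarray(x0, dtype=float)
    memory = []  # previous directions d_{k-1}, ..., d_{k-m}
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        d = -g
        for d_old in memory:
            beta = rho * gnorm / np.linalg.norm(d_old)
            d = d + beta * d_old
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c = 1.0, 1e-4
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= 0.5
        x = x + alpha * d
        memory = ([d] + memory)[:m]  # keep the m most recent directions
    return x

# Usage sketch on f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = supermemory_gradient(f, grad, np.zeros(2))
```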