A new class of supermemory gradient methods
DOI: 10.1016/j.amc.2006.05.079 · zbMath: 1111.65055 · OpenAlex: W2104668044 · MaRDI QID: Q865511
Publication date: 19 February 2007
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2006.05.079
MSC classification: Numerical mathematical programming methods (65K05) · Nonlinear programming (90C30) · Methods of quasi-Newton type (90C53)
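The paper itself is available via the DOI above. As rough orientation only, supermemory gradient methods build each search direction from the current negative gradient combined with several previous directions. The sketch below is a generic illustration of that idea, not the specific class proposed in this paper; the memory depth `m`, the uniform weights `beta / m`, and the Armijo backtracking parameters are all illustrative assumptions.

```python
import numpy as np

def supermemory_gradient(f, grad, x0, m=3, beta=0.3, sigma=1e-4, rho=0.5,
                         tol=1e-8, max_iter=1000):
    # Generic supermemory gradient iteration (illustrative only, NOT the
    # specific class introduced in this paper):
    #   d_k = -g_k + sum_{i=1}^{m} (beta / m) * d_{k-i}
    # paired with a standard Armijo backtracking line search.
    x = np.asarray(x0, dtype=float)
    past = []                                  # up to m previous directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g + sum((beta / m) * dp for dp in past)
        if g @ d >= 0:                         # safeguard: fall back to steepest
            d = -g                             # descent if d is not a descent direction
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + sigma * t * (g @ d) and t > 1e-16:
            t *= rho                           # Armijo backtracking
        x = x + t * d
        past = [d] + past[:m - 1]              # keep the m most recent directions
    return x

# Usage on a hypothetical ill-conditioned convex quadratic:
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(supermemory_gradient(f, grad, np.array([1.0, 1.0, 1.0])))
```

With `m = 0` the iteration reduces to steepest descent; the memory terms are what distinguish (super)memory gradient methods, typically damping the zigzagging of pure gradient steps without storing any matrices.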
Cites Work
- Convergence of quasi-Newton method with new inexact line search
- New inexact line search method for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Supermemory descent methods for unconstrained minimization
- Line search termination criteria for collinear scaling algorithms for minimizing a class of convex functions
- Global convergence result for conjugate gradient methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Matrix algebras in quasi-Newton methods for unconstrained minimization
- Updating the self-scaling symmetric rank one algorithm with limited memory for large-scale unconstrained optimization
- Limited memory quasi-Newton method for large-scale linearly equality-constrained minimization
- A new descent algorithm with curve search rule
- A limited-memory multipoint symmetric secant method for bound constrained optimization
- Global convergence of the method of shortest residuals
- Improved Hessian approximations for the limited memory BFGS method
- A class of nonmonotone stabilization methods in unconstrained optimization
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Study on a supermemory gradient method for the minimization of functions
- Quadratically convergent algorithms and one-dimensional search schemes
- A new super-memory gradient method with curve search rule
- Shifted limited-memory variable metric methods for large-scale unconstrained optimization
- Testing Unconstrained Optimization Software
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
- Convergence of multi-step curve search method for unconstrained optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- Conjugate Directions without Linear Searches
- Convergence of Newton's method for convex best interpolation
- New properties of a nonlinear conjugate gradient method