A new descent algorithm with curve search rule
From MaRDI portal
Publication:1764727
DOI: 10.1016/j.amc.2003.12.058
zbMath: 1069.65065
OpenAlex: W1983014909
MaRDI QID: Q1764727
Publication date: 22 February 2005
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2003.12.058
Keywords: numerical experiments; global convergence; conjugate gradient method; descent method; curve search rule; unconstrained minimization
Related Items (5)
- A new class of supermemory gradient methods
- Convergence of supermemory gradient method
- A hybrid-line-and-curve search globalization technique for inexact Newton methods
- A descent algorithm without line search for unconstrained optimization
- A new super-memory gradient method with curve search rule
Cites Work
- Differential optimization techniques
- Stepsize analysis for descent methods
- Augmentability in optimization theory
- Enlarging the region of convergence of Newton's method for constrained optimization
- A new and dynamic method for unconstrained minimization
- Supermemory descent methods for unconstrained minimization
- Differential gradient methods
- Minimum curvature multistep quasi-Newton methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A note on minimization problems and multistep methods
- Note on global convergence of ODE methods for unconstrained optimization
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Using function-values in multi-step quasi-Newton methods
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A class of nonmonotone stabilization methods in unconstrained optimization
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Study on a supermemory gradient method for the minimization of functions
- Quadratically convergent algorithms and one-dimensional search schemes
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Some convergence properties of the conjugate gradient method
- Restart procedures for the conjugate gradient method
- Numerical Optimization
- Function minimization by conjugate gradients
- Conjugate Directions without Linear Searches