A new class of memory gradient methods with inexact line searches
DOI: 10.1515/1569395054069008 · zbMath: 1072.65086 · OpenAlex: W2026257541 · MaRDI QID: Q4675852
No author found.
Publication date: 6 May 2005
Published in: Journal of Numerical Mathematics
Full work available at URL: https://doi.org/10.1515/1569395054069008
Keywords: algorithms; unconstrained optimization; global convergence; comparison of methods; convergence acceleration; numerical experiments; inexact line search; Polak-Ribière method; Fletcher-Reeves method; memory gradient method; large scale minimization; Armijo's rule; Wolfe's rule
MSC classifications: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Methods of reduced gradient type (90C52)
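The keywords above reference the memory gradient method combined with Armijo's rule for inexact line searches. The following is a minimal generic sketch of that combination, not the specific class of methods proposed in this publication; the update rule (search direction as the negative gradient plus a fixed multiple of the last few directions) and all parameter names are illustrative assumptions.

```python
import numpy as np

def armijo_step(f, fx, slope, x, d, alpha0=1.0, c=1e-4, tau=0.5, max_iter=50):
    """Backtracking line search satisfying the Armijo (sufficient decrease) condition."""
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            break
        alpha *= tau  # shrink the step until sufficient decrease holds
    return alpha

def memory_gradient(f, g, x0, m=3, beta=0.1, tol=1e-6, max_iter=500):
    """Generic memory gradient iteration (illustrative, not the paper's method):
    d_k = -g_k + beta * (d_{k-1} + ... + d_{k-m}), with an Armijo step size."""
    x = np.asarray(x0, dtype=float)
    history = []  # the last m search directions
    for _ in range(max_iter):
        gk = g(x)
        if np.linalg.norm(gk) < tol:
            break
        d = -gk + sum(beta * dp for dp in history)
        if gk @ d >= 0:
            d = -gk  # safeguard: fall back to steepest descent if not a descent direction
        alpha = armijo_step(f, f(x), gk @ d, x, d)
        x = x + alpha * d
        history.append(d)
        if len(history) > m:
            history.pop(0)  # keep only the m most recent directions
    return x
```

For example, minimizing the convex quadratic f(x) = ‖x‖² (gradient 2x) from x₀ = (3, −4) drives the iterate to the origin within the gradient tolerance.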
Related Items (2)
Uses Software
Cites Work
- On the limited memory BFGS method for large scale optimization
- Stepsize analysis for descent methods
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Minimum curvature multistep quasi-Newton methods
- Combining search directions using gradient flows
- A numerical study of limited memory BFGS methods
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Convergence of line search methods for unconstrained optimization
- Using function-values in multi-step quasi-Newton methods
- A class of nonmonotone stabilization methods in unconstrained optimization
- Quadratically convergent algorithms and one-dimensional search schemes
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Numerical Optimization
- Global and superlinear convergence of an algorithm for one-dimensional minimization of convex functions
- Conjugate Directions without Linear Searches
- A nonlinear model for function-value multistep methods