A new class of memory gradient methods with inexact line searches
DOI: 10.1515/1569395054069008 · zbMATH Open: 1072.65086 · OpenAlex: W2026257541 · MaRDI QID: Q4675852 · FDO: Q4675852
Authors:
Publication date: 6 May 2005
Published in: Journal of Numerical Mathematics
Full work available at URL: https://doi.org/10.1515/1569395054069008
Keywords: algorithms; convergence acceleration; global convergence; unconstrained optimization; numerical experiments; comparison of methods; inexact line search; Fletcher-Reeves method; memory gradient method; large scale minimization; Armijo's rule; Polak-Ribière method; Wolfe's rule
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Methods of reduced gradient type (90C52)
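The record itself carries no algorithmic detail, but the keywords name the two ingredients of the paper's topic: a memory gradient iteration (search direction combining the current negative gradient with the previous direction) and an inexact line search such as Armijo's rule. The following is a generic, minimal sketch of that scheme, not the paper's specific class of methods; the fixed memory parameter `beta` and the descent-direction safeguard are illustrative assumptions.

```python
import numpy as np

def armijo(f, x, d, g, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink alpha until the sufficient-decrease
    condition f(x + alpha*d) <= f(x) + c*alpha*g.d holds."""
    alpha = alpha0
    fx = f(x)
    while f(x + alpha * d) > fx + c * alpha * g.dot(d):
        alpha *= rho
    return alpha

def memory_gradient(f, grad, x0, beta=0.1, tol=1e-6, max_iter=500):
    """Generic memory gradient iteration: d_k = -g_k + beta * d_{k-1}.
    beta is a fixed illustrative parameter here; actual memory gradient
    methods choose it adaptively."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first step is plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo(f, x, d, g)
        x = x + alpha * d
        g = grad(x)
        # mix the new negative gradient with the previous direction (the "memory")
        d_new = -g + beta * d
        if g.dot(d_new) >= 0:  # safeguard: fall back to steepest descent
            d_new = -g
        d = d_new
    return x
```

On a simple convex quadratic the iteration drives the gradient norm below the tolerance in a few steps, e.g. `memory_gradient(lambda x: x.dot(x), lambda x: 2 * x, [3.0, -2.0])`.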
Cites Work
- Testing Unconstrained Optimization Software
- Numerical Optimization
- On the limited memory BFGS method for large scale optimization
- Title not available
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A numerical study of limited memory BFGS methods
- A class of nonmonotone stabilization methods in unconstrained optimization
- Stepsize analysis for descent methods
- Multi-step quasi-Newton methods for optimization
- Convergence of line search methods for unconstrained optimization
- Using function-values in multi-step quasi-Newton methods
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Conjugate Directions without Linear Searches
- Combining search directions using gradient flows
- Quadratically convergent algorithms and one-dimensional search schemes
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- Minimum curvature multistep quasi-Newton methods
- Title not available
- Global and superlinear convergence of an algorithm for one-dimensional minimization of convex functions
- A nonlinear model for function-value multistep methods
Cited In (8)
- Title not available
- Global convergence of a memory gradient method without line search
- Strong global convergence of an adaptive nonmonotone memory gradient method
- A new class of memory gradient methods with Wolfe line search
- Title not available
- A memory efficient incremental gradient method for regularized minimization
- Title not available
- Exact gradient methods with memory