A new super-memory gradient method with curve search rule
DOI: 10.1016/j.amc.2004.10.063
zbMath: 1081.65058
OpenAlex: W2088057602
MaRDI QID: Q2571993
Publication date: 14 November 2005
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2004.10.063
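For orientation, the general shape of a super-memory gradient method (a standard form in this literature, sketched here as an assumption; the paper's specific parameter choices and curve search rule appear only in the full text) combines the current gradient with several previous search directions:

\[
  d_k = -g_k + \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i}, \qquad g_k = \nabla f(x_k),
\]

and a curve search rule replaces the straight line \(x_k + t\, d_k\) by a curve, for instance

\[
  x_{k+1} = x_k(t_k), \qquad x_k(t) = x_k + t\, d_k + t^2 s_k,
\]

with \(t_k\) chosen along the curve and \(s_k\) an auxiliary correction vector; both the quadratic form of the curve and the choice of \(s_k\) are illustrative, not taken from the paper.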
Related Items
- A memory gradient method based on the nonmonotone technique
- A new class of supermemory gradient methods
- Memory gradient method for multiobjective optimization
- Convergence of supermemory gradient method
- A generalized super-memory gradient projection method of strongly sub-feasible directions with strong convergence for nonlinear inequality constrained optimization
- Memory gradient method with Goldstein line search
- Supermemory gradient methods for monotone nonlinear equations with convex constraints
- A new supermemory gradient method for unconstrained optimization problems
- A nonmonotone supermemory gradient algorithm for unconstrained optimization
- A memory gradient method for non-smooth convex optimization
Cites Work
- A gradient-related algorithm with inexact line searches
- Differential optimization techniques
- Stepsize analysis for descent methods
- A new and dynamic method for unconstrained minimization
- A Newton-type curvilinear search method for optimization
- A curvilinear optimization method based upon iterative estimation of the eigensystem of the Hessian matrix
- Differential gradient methods
- A class of methods for unconstrained minimization based on stable numerical integration techniques
- Convergence properties of the Beale-Powell restart algorithm
- Multi-step quasi-Newton methods for optimization
- Minimum curvature multistep quasi-Newton methods
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A note on minimization problems and multistep methods
- A new descent algorithm with curve search rule
- Note on global convergence of ODE methods for unconstrained optimization
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Using function-values in multi-step quasi-Newton methods
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A class of nonmonotone stabilization methods in unconstrained optimization
- Quadratically convergent algorithms and one-dimensional search schemes
- Solving Convex Programs by Means of Ordinary Differential Equations
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- An Arc Method for Nonlinear Programming
- A new arc algorithm for unconstrained optimization
- Numerical Optimization
- Conjugate Directions without Linear Searches