Study on a supermemory gradient method for the minimization of functions
From MaRDI portal
Publication:2532077
DOI: 10.1007/BF00930579
zbMATH Open: 0172.19002
OpenAlex: W2030071347
MaRDI QID: Q2532077
Authors: E. E. Cragg, A. V. Levy
Publication date: 1969
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00930579
Cited In (32)
- Global convergence of a memory gradient method for unconstrained optimization
- The Topkis-Veinott algorithm for solving nonlinear programs with lower and upper bounded variables
- Generalized memory gradient projection method for non-linear programming with non-linear equality and inequality constraints
- Generalized conjugate directions
- Global convergence of a memory gradient method without line search
- New conjugate gradient-like methods for unconstrained optimization
- Global convergence of modified HS conjugate gradient method
- A new class of supermemory gradient methods
- Memory gradient method with Goldstein line search
- A generalized super-memory gradient projection method of strongly sub-feasible directions with strong convergence for nonlinear inequality constrained optimization
- Conjugate gradient methods using value of objective function for unconstrained optimization
- Strong global convergence of an adaptive nonmonotone memory gradient method
- A gradient-related algorithm with inexact line searches
- A local MM subspace method for solving constrained variational problems in image recovery
- Optimal simultaneous maximum a posteriori estimation of states, noise statistics and parameters I. Algorithm
- A new descent algorithm with curve search rule
- Supermemory descent methods for unconstrained minimization
- Convergence of supermemory gradient method
- A heuristic iterated-subspace minimization method with pattern search for unconstrained optimization
- On the convergence of a new hybrid projection algorithm
- Pseudo-conjugate directions for the solution of the nonlinear unconstrained optimization problem on a parallel computer
- Quadratically convergent algorithms and one-dimensional search schemes
- On memory gradient method with trust region for unconstrained optimization
- A supermemory gradient projection algorithm for optimization problems with nonlinear constraints
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- Memory gradient method for multiobjective optimization
- Numerical experiments on dual matrix algorithms for function minimization
- Numerical experiments on quadratically convergent algorithms for function minimization
- Approximation methods for the unconstrained optimization
- Extensions of CGS algorithms: Generalized least-square solutions
- A nonmonotone supermemory gradient algorithm for unconstrained optimization