A memory gradient method based on the nonmonotone technique
DOI: 10.3934/jimo.2016050 · zbMath: 1364.90318 · OpenAlex: W2517709368 · MaRDI QID: Q2628189
Publication date: 12 June 2017
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2016050
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
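This record does not describe the algorithm itself, but the title and the cited works (notably the Grippo-Lampariello-Lucidi nonmonotone line search paper listed below) indicate the two ingredients: a memory gradient direction d_k = -g_k + beta_k * d_{k-1} and a nonmonotone acceptance rule that compares the trial point against the maximum of the last M function values rather than the most recent one. The sketch below is a minimal illustration of that combination under these assumptions, not the method proposed in the paper; the function name, the FR-style weight beta_k, and all parameter values are hypothetical choices made for the example.

```python
import numpy as np

def nonmonotone_memory_gradient(f, grad, x0, M=10, gamma=1e-4,
                                sigma=0.5, tol=1e-6, max_iter=5000):
    """Illustrative sketch: a memory gradient direction with a
    GLL-style nonmonotone Armijo line search (not the paper's scheme)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first step: steepest descent
    f_hist = [f(x)]              # last M values define the nonmonotone reference

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        gd = g @ d
        f_ref = max(f_hist)      # GLL reference: max of the recent values
        alpha = 1.0
        # Nonmonotone Armijo: accept alpha once
        #   f(x + alpha d) <= max_{0<=j<min(k,M)} f(x_{k-j}) + gamma*alpha*g^T d
        while alpha > 1e-16 and f(x + alpha * d) > f_ref + gamma * alpha * gd:
            alpha *= sigma       # backtrack
        x = x + alpha * d
        g_new = grad(x)
        # Memory gradient direction: negative gradient plus a weighted copy of
        # the previous direction (FR-style weight, an arbitrary choice here).
        beta = (g_new @ g_new) / (g @ g + 1e-12)
        d = -g_new + beta * d
        if g_new @ d >= 0:       # safeguard: keep d a descent direction
            d = -g_new
        g = g_new
        f_hist.append(f(x))
        if len(f_hist) > M:
            f_hist.pop(0)
    return x

if __name__ == "__main__":
    # Usage on the Rosenbrock function.
    rosen = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2)])
    print(nonmonotone_memory_gradient(rosen, rosen_grad, [-1.2, 1.0]))
```

The max-of-recent-values acceptance rule is the one introduced in the cited Grippo-Lampariello-Lucidi line search paper; it allows occasional increases in f so that long steps survive on curved valleys where a monotone Armijo rule would force tiny step sizes.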
Cites Work
- The convergence of conjugate gradient method with nonmonotone line search
- A nonmonotonic trust region algorithm for a class of semi-infinite minimax programming
- A nonmonotone trust region method for unconstrained optimization
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- Global convergence of a memory gradient method for unconstrained optimization
- Strong global convergence of an adaptive nonmonotone memory gradient method
- Global convergence of the nonmonotone MBFGS method for nonconvex unconstrained minimization
- Nonmonotonic trust region algorithm
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A new supermemory gradient method for unconstrained optimization problems
- Nonmonotone trust region method for solving optimization problems
- A new variant of the memory gradient method for unconstrained optimization
- A new descent memory gradient method and its global convergence
- An unconstrained optimization method using nonmonotone second order Goldstein's line search
- Incorporating nonmonotone strategies into the trust region method for unconstrained optimization
- On memory gradient method with trust region for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A new super-memory gradient method with curve search rule
- A new nonmonotone trust-region method of conic model for solving unconstrained optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Numerical Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- Benchmarking optimization software with performance profiles
- On the nonmonotone line search