A modified two-point stepsize gradient algorithm for unconstrained minimization
DOI: 10.1080/10556788.2012.667811
zbMATH: 1278.90448
OpenAlex: W2045423624
MaRDI QID: Q2867423
Masoud Fatemi, Saman Babaie-Kafaki
Publication date: 19 December 2013
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2012.667811
Keywords: unconstrained optimization; convexity; nonmonotone line search; modified secant equation; two-point stepsize gradient algorithm
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Numerical methods based on nonlinear programming (49M37); Convexity of real functions of several variables, generalizations (26B25); Methods of reduced gradient type (90C52)
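The publication concerns a modification of the Barzilai–Borwein two-point stepsize gradient method. For context, the following is a minimal sketch of the *classical* BB method (not the authors' modified variant): the stepsize is built from the two most recent iterates via the secant pair s = x_k − x_{k−1}, y = g_k − g_{k−1}, giving α_k = sᵀs / sᵀy. The function and parameter names here are illustrative, not taken from the paper.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=100, tol=1e-8, alpha0=1e-3):
    """Classical Barzilai-Borwein two-point stepsize gradient method (sketch).

    Uses the BB1 stepsize alpha_k = s^T s / s^T y, where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0  # small initial stepsize for the first (plain gradient) step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x
        y = g_new - g
        sty = s @ y
        # BB1 stepsize; keep the previous stepsize if s^T y is not positive
        alpha = (s @ s) / sty if sty > 0 else alpha
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(2))
```

On a strictly convex quadratic like this one, the BB iteration converges R-linearly (one of the cited works below establishes exactly this), and the sketch above needs no line search; the paper's modified variant additionally employs a nonmonotone line search and a modified secant equation to handle general nonlinear objectives.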
Related Items (4)
Uses Software
Cites Work
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- On the limited memory BFGS method for large scale optimization
- A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule
- New quasi-Newton equation and related methods for unconstrained optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- On the asymptotic directions of the s-dimensional optimum gradient method
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- CUTEr and SifDec
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles