A descent algorithm without line search for unconstrained optimization
From MaRDI portal
Publication: 1044422
DOI: 10.1016/j.amc.2009.08.058 · zbMath: 1181.65090 · MaRDI QID: Q1044422
Publication date: 18 December 2009
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2009.08.058
Cites Work
- Unnamed Item
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- Estimation of the optimal constants and the thickness of thin films using unconstrained optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- A new descent algorithm with curve search rule
- Convergence of line search methods for unconstrained optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Optimization theory and methods. Nonlinear programming
- Convergence of descent method without line search
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- Gradient Method with Retards and Generalizations
- Numerical Optimization
- Alternate minimization gradient method
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- On the Barzilai and Borwein choice of steplength for the gradient method
- Conjugate Directions without Linear Searches
- Minimization algorithms based on supervisor and searcher cooperation
- Adaptive two-point stepsize gradient algorithm
- Global convergence of conjugate gradient methods without line search