New inexact line search method for unconstrained optimization
From MaRDI portal
DOI: 10.1007/s10957-005-6553-6 · zbMath: 1116.90097 · MaRDI QID: Q850832
Publication date: 6 November 2006
Published in: Journal of Optimization Theory and Applications
Full work available at URL: http://hdl.handle.net/2027.42/45195
Related Items
- Modified nonmonotone Armijo line search for descent method
- Nonmonotone adaptive trust region method
- Convergence of descent method with new line search
- A new class of supermemory gradient methods
- A new trust region method with adaptive radius
- A new robust line search technique based on Chebyshev polynomials
- The convergence of subspace trust region methods
- A new non-monotone self-adaptive trust region method for unconstrained optimization
- Convergence of nonmonotone line search method
- A new family of conjugate gradient methods
- Global convergence of conjugate gradient method
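The subject of this entry, inexact line search, refers to step-size rules (such as the Armijo sufficient-decrease condition used by several of the works cited below) that accept a step without solving the one-dimensional minimization exactly. As a hedged illustration only, and not the specific method of the indexed paper, here is a minimal sketch of classical Armijo backtracking in Python; the function names and parameters are illustrative choices:

```python
def armijo_backtracking(f, x, d, g, alpha0=1.0, c=1e-4, rho=0.5, max_iter=50):
    """Shrink the step alpha until the Armijo sufficient-decrease condition
    f(x + alpha*d) <= f(x) + c*alpha*<g, d> holds (illustrative sketch)."""
    fx = f(x)
    # Directional derivative <g, d>; negative when d is a descent direction.
    slope = sum(gi * di for gi, di in zip(g, d))
    alpha = alpha0
    for _ in range(max_iter):
        trial = [xi + alpha * di for xi, di in zip(x, d)]
        if f(trial) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho  # backtrack: geometrically reduce the step
    return alpha

# Example: f(x) = ||x||^2 from x = (1, 1) along the steepest-descent direction.
f = lambda v: sum(vi * vi for vi in v)
x = [1.0, 1.0]
g = [2.0 * xi for xi in x]   # gradient of f at x
d = [-gi for gi in g]        # steepest-descent direction -g
step = armijo_backtracking(f, x, d, g)
```

With these values the first backtrack (alpha = 0.5) lands exactly at the minimizer, so the accepted step already decreases f; an exact line search would find the same point here, but the Armijo rule requires only a few function evaluations in general.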
Cites Work
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Stepsize analysis for descent methods
- Some convergence properties of descent methods
- Convergence of line search methods for unconstrained optimization
- A new unconstrained optimization method for imprecise function and gradient values
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- On Convergence Properties of Algorithms for Unconstrained Minimization
- On the Barzilai and Borwein choice of steplength for the gradient method
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On the nonmonotone line search