Stepsize analysis for descent methods
Publication: 1133148
DOI: 10.1007/BF00935546
zbMath: 0421.49030
MaRDI QID: Q1133148
Publication date: 1981
Published in: Journal of Optimization Theory and Applications
Related Items (19)
- Convergence of line search methods for unconstrained optimization
- New inexact line search method for unconstrained optimization
- Accelerating convergence in minisum location problem with \(\ell_p\) norms
- Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds
- A gradient-related algorithm with inexact line searches
- New cautious BFGS algorithm based on modified Armijo-type line search
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
- A survey of gradient methods for solving nonlinear optimization
- Memory gradient method with Goldstein line search
- Modified nonmonotone Armijo line search for descent method
- A new descent algorithm with curve search rule
- A new class of memory gradient methods with inexact line searches
- Solution of location problems with radial cost functions
- Comparing gradient descent with automatic differentiation and particle swarm optimization techniques for estimating tumor blood flow parameters in contrast-enhanced imaging
- Computer Algebra and Line Search
- Accelerating convergence in the Fermat-Weber location problem
- On the convergence of the descent methods
- Convergence of descent method without line search
- A new super-memory gradient method with curve search rule
Cites Work
- Gradient methods of maximization
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Some inequalities involving the Euclidean condition of a matrix
- Cauchy's method of minimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- On the Relative Efficiencies of Gradient Methods
- Convergence Conditions for Ascent Methods