Stopping criteria for linesearch methods without derivatives
Publication: 3696884
DOI: 10.1007/BF02591934
zbMath: 0576.90087
OpenAlex: W1979802267
MaRDI QID: Q3696884
Renato De Leone, Manlio Gaudioso
Publication date: 1984
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf02591934
Keywords: unconstrained optimization; stopping criteria; directional derivative; nonderivative methods; linesearch; stepsize; step-length selection
Numerical mathematical programming methods (65K05) Nonlinear programming (90C30) Numerical methods based on nonlinear programming (49M37)
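For context on the topic of this publication, the following is a minimal illustrative sketch of a derivative-free backtracking linesearch that uses an Armijo-like sufficient-decrease test and treats an unacceptably small stepsize as a stopping signal. The specific condition f(x + a d) <= f(x) - gamma a^2 ||d||^2, the parameter names, and the tolerances are assumptions chosen for illustration and need not match the criteria analyzed by De Leone and Gaudioso.

```python
# Illustrative sketch of a derivative-free backtracking linesearch with an
# Armijo-like sufficient-decrease test (no gradients required).
# The decrease condition and all parameters below are assumptions for
# illustration, not the exact criteria of the cited paper.
import numpy as np

def derivative_free_linesearch(f, x, d, alpha0=1.0, gamma=1e-4, delta=0.5,
                               alpha_min=1e-10):
    """Backtrack from alpha0 until f(x + alpha*d) <= f(x) - gamma*alpha^2*||d||^2.

    Returns the accepted stepsize, or 0.0 if alpha falls below alpha_min,
    which an outer method may interpret as a stopping signal along d.
    """
    fx = f(x)
    dnorm2 = float(np.dot(d, d))
    alpha = alpha0
    while alpha >= alpha_min:
        if f(x + alpha * d) <= fx - gamma * alpha**2 * dnorm2:
            return alpha          # sufficient decrease achieved
        alpha *= delta            # shrink the trial stepsize and retry
    return 0.0                    # no acceptable step found: candidate stopping test

if __name__ == "__main__":
    # Toy usage on a convex quadratic, searching along two fixed directions.
    f = lambda x: float(np.dot(x, x))
    x = np.array([1.0, -2.0])
    for d in (np.array([-1.0, 0.0]), np.array([0.0, 1.0])):
        print("direction", d, "accepted stepsize", derivative_free_linesearch(f, x, d))
```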
Related Items (20)
- Globally convergent block-coordinate techniques for unconstrained optimization
- Global convergence and stabilization of unconstrained minimization methods without derivatives
- Nonmonotone derivative-free methods for nonlinear equations
- Perturbed steepest-descent technique in multiextremal problems
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Convergence properties of the dependent PRP conjugate gradient methods
- Use of the minimum norm search direction in a nonmonotone version of the Gauss-Newton method
- Global convergence properties of two modified BFGS-type methods
- A bundle-type method for nonsmooth DC programs
- A pattern search and implicit filtering algorithm for solving linearly constrained minimization problems with noisy objective functions
- A local search method for costly black-box problems and its application to CSP plant start-up optimization refinement
- Global convergence of the BFGS algorithm with nonmonotone linesearch
- Spectral gradient method for impulse noise removal
- Global convergence technique for the Newton method with periodic Hessian evaluation
- A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations
- A unified convergence framework for nonmonotone inexact decomposition methods
- Minimizing Piecewise-Concave Functions Over Polyhedra
- Derivative-free optimization methods
- Globally convergent diagonal Polak-Ribière-Polyak like algorithm for nonlinear equations
Cites Work
- Diagonalized multiplier methods and quasi-Newton methods for constrained optimization
- An effective algorithm for minimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- A note on a sufficient-decrease criterion for a non-derivative step-length procedure
- A superlinearly convergent algorithm for minimization without evaluating derivatives
- Convergence Conditions for Ascent Methods