Global convergence of nonmonotone descent methods for unconstrained optimization problems
Publication: Q697549
DOI: 10.1016/S0377-0427(02)00420-X
zbMath: 1007.65044
OpenAlex: W2018624535
MaRDI QID: Q697549
Jie Sun, Ji-ye Han, Wen-Yu Sun
Publication date: 17 September 2002
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/s0377-0427(02)00420-x
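For readers unfamiliar with the nonmonotone line searches studied in this paper and in the related items below, here is a minimal sketch of the classic Grippo-Lampariello-Lucidi (GLL) nonmonotone Armijo condition (introduced in the cited work "A Nonmonotone Line Search Technique for Newton's Method"): a step is accepted when the new objective value lies below the maximum of the last few objective values, rather than below the current one. The routine names, the memory length M, and the parameters delta, sigma, and alpha0 are illustrative choices, not taken from this publication.

```python
import numpy as np

def nonmonotone_armijo_step(f, grad_f, x, d, f_history,
                            delta=1e-4, sigma=0.5, alpha0=1.0, max_backtracks=50):
    """Backtracking step size satisfying a GLL-style nonmonotone Armijo condition.

    Accept alpha once
        f(x + alpha*d) <= max(f_history) + delta * alpha * grad_f(x).dot(d),
    where f_history holds the most recent objective values (at most M+1 of them).
    """
    f_ref = max(f_history)        # reference value: max over recent iterates
    slope = grad_f(x).dot(d)      # negative for a descent direction
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * slope:
            return alpha
        alpha *= sigma            # backtrack
    return alpha

def demo():
    # steepest descent with memory M = 5 on a simple quadratic (illustrative only)
    f = lambda x: 0.5 * x.dot(x)
    grad_f = lambda x: x
    x, M = np.array([3.0, -4.0]), 5
    history = [f(x)]
    for _ in range(20):
        d = -grad_f(x)
        alpha = nonmonotone_armijo_step(f, grad_f, x, d, history)
        x = x + alpha * d
        history = (history + [f(x)])[-M - 1:]   # keep the last M+1 objective values
    return x
```

Because the acceptance test compares against the maximum of several past values, the objective is allowed to increase temporarily between iterations, which is the defining feature of the nonmonotone descent methods whose global convergence this paper analyzes.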
Related Items (41)
- Nonmonotone trust region method for solving optimization problems
- Parallel variable distribution algorithm for constrained optimization with nonmonotone technique
- New inexact line search method for unconstrained optimization
- Two accelerated nonmonotone adaptive trust region line search methods
- Non-monotone algorithm for minimization on arbitrary domains with applications to large-scale orthogonal Procrustes problem
- A Modified Non-Monotone BFGS Method for Non-Convex Unconstrained Optimization
- Strong global convergence of an adaptive nonmonotone memory gradient method
- A nonmonotone trust region method with new inexact line search for unconstrained optimization
- A nonmonotone line search slackness technique for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- An adaptive nonmonotone line search technique for solving systems of nonlinear equations
- New cautious BFGS algorithm based on modified Armijo-type line search
- Adaptive nonmonotone line search method for unconstrained optimization
- Nonmonotone algorithm for minimization on closed sets with applications to minimization on Stiefel manifolds
- Nonmonotone second-order Wolfe's line search method for unconstrained optimization problems
- A modified trust region method with Beale's PCG technique for optimization
- Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
- A Trust Region Algorithm with Memory for Equality Constrained Optimization
- A new nonmonotone line search method for nonsmooth nonconvex optimization
- Convergence of memory gradient methods
- A nonmonotone line search filter method with reduced Hessian updating for nonlinear optimization
- A survey of gradient methods for solving nonlinear optimization
- A new class of nonmonotone conjugate gradient training algorithms
- An unconstrained optimization method using nonmonotone second order Goldstein's line search
- Multivariate spectral gradient method for unconstrained optimization
- Convergence of PRP method with new nonmonotone line search
- A nonmonotone filter Barzilai-Borwein method for optimization
- Globalizing a nonsmooth Newton method via nonmonotone path search
- A new nonmonotone line search technique for unconstrained optimization
- A nonmonotone trust region method based on nonincreasing technique of weighted average of the successive function values
- Modified nonmonotone Armijo line search for descent method
- Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- A kind of nonmonotone filter method for nonlinear complementarity problem
- Nonmonotone adaptive trust-region method for unconstrained optimization problems
- A smoothing and regularization Broyden-like method for nonlinear inequalities
- An adaptive conic trust-region method for unconstrained optimization
- A new family of conjugate gradient methods
- A modified SQP method with nonmonotone technique and its global convergence
- A nonmonotone line search method and its convergence for unconstrained optimization
- Convergence of descent method with new line search
Cites Work
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Über die globale Konvergenz von Variable-Metrik-Verfahren mit nicht-exakter Schrittweitenbestimmung [On the global convergence of variable metric methods with inexact step-size determination]
- Nonmonotonic trust region algorithm
- Trust region algorithm for nonsmooth optimization
- Avoiding the Maratos Effect by Means of a Nonmonotone Line Search I. General Constrained Problems
- The watchdog technique for forcing convergence in algorithms for constrained optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A nonmonotone inexact Newton algorithm for nonlinear systems of equations
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- Global convergence of the BFGS algorithm with nonmonotone linesearch