Modified nonmonotone Armijo line search for descent method
From MaRDI portal
Publication:535246
DOI: 10.1007/s11075-010-9408-7
zbMath: 1228.65092
OpenAlex: W2056758172
MaRDI QID: Q535246
Publication date: 11 May 2011
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-010-9408-7
Keywords: unconstrained optimization; global convergence; numerical results; convex optimization; matrix-free non-monotone Armijo line search
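The keywords refer to nonmonotone Armijo line searches. For readers unfamiliar with the idea, the sketch below implements the classical nonmonotone (Grippo-Lampariello-Lucidi-style) Armijo rule, which accepts a step when the trial value lies below the maximum of the last few objective values plus a sufficient-decrease term; this is the baseline condition the publication modifies, not the paper's own modified rule. Function names and parameter defaults here are illustrative assumptions.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, f_history,
                       delta=1e-4, beta=0.5, max_backtracks=50):
    """Backtracking line search with the classical nonmonotone Armijo rule.

    Accepts the first step alpha (from alpha = 1, halving each time) with
        f(x + alpha*d) <= max(f_history) + delta * alpha * grad_f(x).dot(d),
    where f_history holds the most recent objective values. With
    f_history = [f(x)] this reduces to the ordinary (monotone) Armijo rule.
    """
    fmax = max(f_history)              # reference value: best of recent history
    slope = grad_f(x).dot(d)           # directional derivative; negative for a descent direction
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fmax + delta * alpha * slope:
            return alpha
        alpha *= beta                  # backtrack
    return alpha

# Illustrative use on f(x) = ||x||^2 with the steepest-descent direction
f = lambda x: float(x.dot(x))
g = lambda x: 2.0 * x
x = np.array([1.0, -2.0])
d = -g(x)
alpha = nonmonotone_armijo(f, g, x, d, f_history=[f(x)])  # → 0.5 for this quadratic
```

Keeping a short history (e.g. the last 10 values) instead of only `f(x)` lets the iterates occasionally increase the objective, which can help descent methods escape narrow valleys.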
Related Items (6)
- A non-monotone pattern search approach for systems of nonlinear equations
- A linearly convergent algorithm for sparse signal reconstruction
- Accelerating optimization by tracing valley
- A Shamanskii-like self-adaptive Levenberg-Marquardt method for nonlinear equations
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- A new class of nonmonotone conjugate gradient training algorithms
Uses Software
Cites Work
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- New inexact line search method for unconstrained optimization
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Stepsize analysis for descent methods
- Nonmonotone trust region methods with curvilinear path in unconstrained optimization
- Nonmonotonic trust region algorithm
- Optimization. Algorithms and consistent approximations
- Some convergence properties of descent methods
- Non-monotone trust-region algorithms for nonlinear optimization subject to convex constraints
- A nonmonotone adaptive trust region method and its convergence
- Convergence of line search methods for unconstrained optimization
- A nonmonotone trust region algorithm for equality constrained optimization
- A nonmonotone trust region algorithm for unconstrained nonsmooth optimization
- A new unconstrained optimization method for imprecise function and gradient values
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Convergence of nonmonotone line search method
- Minimization of functions having Lipschitz continuous first partial derivatives
- Convergence of descent method without line search
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Avoiding the Maratos Effect by Means of a Nonmonotone Line Search I. General Constrained Problems
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- On Convergence Properties of Algorithms for Unconstrained Minimization
- Numerical Optimization
- CUTE
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- Global convergence of the BFGS algorithm with nonmonotone line search
- On the Barzilai and Borwein choice of steplength for the gradient method
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On Steepest Descent
- Benchmarking optimization software with performance profiles.
- On the nonmonotone line search
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method