Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
From MaRDI portal
Publication: 438775
DOI: 10.1007/s10957-011-9960-x
zbMath: 1262.90170
OpenAlex: W2039084584
MaRDI QID: Q438775
Hiroshi Yabe, Yasushi Narushima, Kaori Sugiki
Publication date: 31 July 2012
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-011-9960-x
Keywords: global convergence; unconstrained optimization problems; descent search directions; secant conditions; three-term conjugate gradient methods
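For orientation, a generic sketch of the setting (the specific parameter choices are the paper's contribution and are not reproduced here): three-term conjugate gradient methods compute iterates $x_{k+1} = x_k + \alpha_k d_k$, where the search direction augments the classical two-term recurrence with a third correction term, chosen so that every $d_k$ is a descent direction and a secant (quasi-Newton) condition holds:
\[
  d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1} + \gamma_k p_k \quad (k \ge 1),
\]
with $g_k = \nabla f(x_k)$ and a correction vector $p_k$. The secant condition reads $B_k s_{k-1} = y_{k-1}$, where $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$, and the descent property means $g_k^{\mathsf T} d_k < 0$ for all $k$ (sufficient descent: $g_k^{\mathsf T} d_k \le -c\,\|g_k\|^2$ for some $c > 0$).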
Related Items
- Comment on "A new three-term conjugate gradient method for unconstrained problem"
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
- A hybrid quasi-Newton method with application in sparse recovery
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- New three-term conjugate gradient algorithm for solving monotone nonlinear equations and signal recovery problems
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A modified scaled memoryless symmetric rank-one method
- A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- A restart scheme for the memoryless BFGS method
- A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Two modified three-term conjugate gradient methods with sufficient descent property
- An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- A limited memory descent Perry conjugate gradient method
- A modified three-term conjugate gradient method with sufficient descent property
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A new three-term conjugate gradient algorithm for unconstrained optimization
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- Descent Symmetrization of the Dai–Liao Conjugate Gradient Method
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A modified Perry conjugate gradient method and its global convergence
Uses Software
Cites Work
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Technical Note—A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- CUTE: constrained and unconstrained testing environment
- Line search algorithms with guaranteed sufficient decrease
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- A new nonlinear conjugate gradient method for unconstrained optimization
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles