Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
DOI: 10.1007/s10957-011-9960-x · zbMATH Open: 1262.90170 · OpenAlex: W2039084584 · MaRDI QID: Q438775
Authors: Kaori Sugiki, Yasushi Narushima, Hiroshi Yabe
Publication date: 31 July 2012
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-011-9960-x
Recommendations
- Three-term conjugate gradient methods based on secant conditions for unconstrained optimization
- Descent three-term conjugate gradient methods based on secant conditions for unconstrained optimization
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
Keywords: global convergence; unconstrained optimization problems; descent search directions; secant conditions; three-term conjugate gradient methods
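To illustrate the class of methods this record catalogues, below is a minimal sketch of a descent three-term conjugate gradient iteration of Polak–Ribière–Polyak type, in the spirit of the descent modified PRP method cited under "Cites Work". The function name `three_term_cg`, the Armijo backtracking constants, and the quadratic test problem are illustrative assumptions, not the authors' algorithm; the point of the third term is that the search direction satisfies the sufficient descent identity d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 by construction.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term PRP-type CG sketch (illustrative, not the paper's method).

    Direction update: d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,
    with beta_k = g_{k+1}^T y_k / ||g_k||^2 and theta_k = g_{k+1}^T d_k / ||g_k||^2,
    so that d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 (a descent direction) exactly.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (descent of d is guaranteed above).
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g          # gradient difference, as in secant-type updates
        gg = g @ g
        beta = (g_new @ y) / gg    # PRP-type parameter
        theta = (g_new @ d) / gg   # third-term coefficient enforcing descent
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T Q x (minimizer at the origin).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x
x_star = three_term_cg(f, grad, np.array([1.0, 1.0]))
```

The cancellation beta_k g_{k+1}^T d_k - theta_k g_{k+1}^T y_k = 0 is what makes the descent property independent of the line search, which is the structural feature shared by the three-term families surveyed on this page.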
Cites Work
- Algorithm 851
- Testing Unconstrained Optimization Software
- CUTE
- Title not available
- Benchmarking optimization software with performance profiles.
- Function minimization by conjugate gradients
- Line search algorithms with guaranteed sufficient decrease
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Technical Note—A Modified Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A survey of nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- Multi-step quasi-Newton methods for optimization
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- A NEW NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION
Cited In (48)
- A modified three-term conjugate gradient method with sufficient descent property
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- A hybrid quasi-Newton method with application in sparse recovery
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- Descent three-term conjugate gradient methods based on secant conditions for unconstrained optimization
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- Comment on “A new three-term conjugate gradient method for unconstrained problem”
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- A modified Perry conjugate gradient method and its global convergence
- A restart scheme for the memoryless BFGS method
- A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Descent symmetrization of the Dai-Liao conjugate gradient method
- Three-term conjugate gradient methods based on secant conditions for unconstrained optimization
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- An extended version of the memoryless DFP algorithm with the sufficient descent property
- Two modified three-term conjugate gradient methods with sufficient descent property
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A modified scaled memoryless symmetric rank-one method
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A new version of augmented self-scaling BFGS method
- Global convergence of three-term conjugate gradient methods on general functions under a new inexact line search strategy
- Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
- New three-term conjugate gradient algorithm for solving monotone nonlinear equations and signal recovery problems
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A limited memory descent Perry conjugate gradient method