Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization

From MaRDI portal
Publication: Q438775

DOI: 10.1007/s10957-011-9960-x
zbMath: 1262.90170
OpenAlex: W2039084584
MaRDI QID: Q438775

Kaori Sugiki, Yasushi Narushima, Hiroshi Yabe

Publication date: 31 July 2012

Published in: Journal of Optimization Theory and Applications

Full work available at URL: https://doi.org/10.1007/s10957-011-9960-x
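For orientation, three-term conjugate gradient methods of the kind studied in this paper update the search direction as d_k = -g_k + beta_k d_{k-1} + theta_k y_{k-1}, where y_{k-1} = g_k - g_{k-1}, with the parameters chosen so that every d_k is a descent direction. The Python sketch below is a generic member of this family (a Hestenes–Stiefel-type choice for which g_k^T d_k = -||g_k||^2 holds by construction); it is an illustrative assumption, not the authors' secant-condition-based parameter formulas, and the Armijo backtracking is a simple stand-in for the Wolfe line search assumed in the paper's convergence analysis.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic three-term CG sketch (illustrative, not the paper's method).

    Direction update: d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
    with beta_k = (g_k . y_{k-1}) / (d_{k-1} . y_{k-1})  (Hestenes-Stiefel type)
    and theta_k = (g_k . d_{k-1}) / (d_{k-1} . y_{k-1}), so that
    g_k . d_k = -||g_k||^2: every direction is a sufficient descent direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (stand-in for the Wolfe conditions).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, g_dot_d = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * g_dot_d:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:
            d_new = -g_new                       # safeguarded restart
        else:
            beta = (g_new @ y) / dy              # HS-type CG parameter
            theta = (g_new @ d) / dy             # third-term coefficient
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x

if __name__ == "__main__":
    # Usage example on the Rosenbrock function; the minimizer is (1, 1).
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(three_term_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

The third term is what distinguishes this family from classical two-term CG: it corrects the direction so that descent holds independently of the line search, which is the property the paper's secant-condition-based variants are designed to guarantee.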



Related Items

Comment on "A new three-term conjugate gradient method for unconstrained problem"
An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
A class of descent four-term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update
Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
A hybrid quasi-Newton method with application in sparse recovery
A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
New three-term conjugate gradient algorithm for solving monotone nonlinear equations and signal recovery problems
A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
A modified scaled memoryless symmetric rank-one method
A family of the modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
A restart scheme for the memoryless BFGS method
A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
A new modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent property and its global convergence
Two modified three-term conjugate gradient methods with sufficient descent property
An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
A limited memory descent Perry conjugate gradient method
A modified three-term conjugate gradient method with sufficient descent property
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
Some nonlinear conjugate gradient methods based on spectral scaling secant equations
A linear hybridization of the Hestenes–Stiefel method and the memoryless BFGS technique
A new three-term conjugate gradient algorithm for unconstrained optimization
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices
A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
Descent Symmetrization of the Dai–Liao Conjugate Gradient Method
A new accelerated conjugate gradient method for large-scale unconstrained optimization
Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
Dai–Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
A modified Perry conjugate gradient method and its global convergence

