Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization

From MaRDI portal

Publication:438775

DOI: 10.1007/s10957-011-9960-x
zbMath: 1262.90170
OpenAlex: W2039084584
MaRDI QID: Q438775

Hiroshi Yabe, Yasushi Narushima, Kaori Sugiki

Publication date: 31 July 2012

Published in: Journal of Optimization Theory and Applications

Full work available at URL: https://doi.org/10.1007/s10957-011-9960-x




Related Items (43)

Comment on ``A new three-term conjugate gradient method for unconstrained problem''
An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
A hybrid quasi-Newton method with application in sparse recovery
A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
New three-term conjugate gradient algorithm for solving monotone nonlinear equations and signal recovery problems
A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
A modified scaled memoryless symmetric rank-one method
A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
A restart scheme for the memoryless BFGS method
A descent family of the spectral Hestenes–Stiefel method by considering the quasi-Newton method
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
Unnamed Item
A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
Two modified three-term conjugate gradient methods with sufficient descent property
An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
A limited memory descent Perry conjugate gradient method
A modified three-term conjugate gradient method with sufficient descent property
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
Some nonlinear conjugate gradient methods based on spectral scaling secant equations
A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
A new three-term conjugate gradient algorithm for unconstrained optimization
An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
Descent Symmetrization of the Dai–Liao Conjugate Gradient Method
A new accelerated conjugate gradient method for large-scale unconstrained optimization
Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
A modified Perry conjugate gradient method and its global convergence


Uses Software


Cites Work




This page was built for publication: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization