Global convergence properties of nonlinear conjugate gradient methods with modified secant condition


Publication:1876593

DOI: 10.1023/B:COAP.0000026885.81997.88
zbMath: 1056.90130
OpenAlex: W2029493470
MaRDI QID: Q1876593

Authors: Masahiro Takano, Hiroshi Yabe

Publication date: 20 August 2004

Published in: Computational Optimization and Applications

Full work available at URL: https://doi.org/10.1023/b:coap.0000026885.81997.88
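The publication concerns nonlinear conjugate gradient (CG) methods whose update parameter is derived from a modified secant condition, typically one that incorporates function-value information in addition to gradients. The following Python sketch illustrates the general pattern only: the Zhang-Deng-Chen-type choice of theta, the Armijo backtracking line search, and the restart safeguard are assumptions made for illustration and are not claimed to reproduce the exact method analyzed in the paper.

```python
import numpy as np

def modified_secant_cg(f, grad, x0, max_iter=2000, tol=1e-6):
    """Illustrative nonlinear CG with a modified secant vector.

    The direction is d = -g + beta * d_prev, with a Hestenes-Stiefel-type
    beta computed from z = y + (theta / s^T s) * s, where theta also uses
    function values. This is a sketch of the general idea, not the exact
    scheme of the paper.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Simple Armijo backtracking line search (illustrative choice).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        f_new, g_new = f(x_new), grad(x_new)
        s, y = x_new - x, g_new - g
        # Modified secant vector z = y + (theta / s^T s) * s, with theta
        # built from function values (a Zhang-Deng-Chen-type assumption).
        theta = 6.0 * (fx - f_new) + 3.0 * (g + g_new).dot(s)
        z = y + (theta / max(s.dot(s), 1e-16)) * s
        denom = d.dot(z)
        beta = g_new.dot(z) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:
            d = -g_new  # safeguard: restart with steepest descent
        x, g, fx = x_new, g_new, f_new
    return x

if __name__ == "__main__":
    # Usage example on the Rosenbrock test function.
    def rosen(x):
        return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

    def rosen_grad(x):
        return np.array([
            -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0] ** 2),
        ])

    print(modified_secant_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```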






Related Items (67)

On Hager and Zhang's conjugate gradient method with guaranteed descent
Sufficient descent conjugate gradient methods for large-scale optimization problems
Spectral method and its application to the conjugate gradient method
An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
Sufficient descent nonlinear conjugate gradient methods with conjugacy condition
A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
A modified conjugate gradient method based on a modified secant equation
A class of one parameter conjugate gradient methods
An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations
A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
An adaptive modified three-term conjugate gradient method with global convergence
A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
A new conjugate gradient algorithm for training neural networks based on a modified secant equation
Two-step conjugate gradient method for unconstrained optimization
Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
A modified conjugate gradient method for general convex functions
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization
A modified conjugacy condition and related nonlinear conjugate gradient method
A survey of gradient methods for solving nonlinear optimization
An improved nonlinear conjugate gradient method with an optimal property
A novel value for the parameter in the Dai-Liao-type conjugate gradient method
A modified Hestense–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
Multi-step nonlinear conjugate gradient methods for unconstrained minimization
A limited memory descent Perry conjugate gradient method
Conjugate gradient methods using value of objective function for unconstrained optimization
Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
Two new conjugate gradient methods based on modified secant equations
Some nonlinear conjugate gradient methods based on spectral scaling secant equations
A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
A descent family of Dai–Liao conjugate gradient methods
A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration
Convergence of the descent Dai–Yuan conjugate gradient method for unconstrained optimization
A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
On the extension of Dai-Liao conjugate gradient method for vector optimization
A conjugate gradient method with sufficient descent property
Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint
Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
A modified BFGS algorithm based on a hybrid secant equation
A new accelerated conjugate gradient method for large-scale unconstrained optimization
Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
A truncated descent HS conjugate gradient method and its global convergence
Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
A modified bat algorithm with conjugate gradient method for global optimization
A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
A nonlinear conjugate gradient method based on the MBFGS secant condition
A new hybrid algorithm for convex nonlinear unconstrained optimization
A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
A modified Perry conjugate gradient method and its global convergence







