Two new conjugate gradient methods based on modified secant equations

From MaRDI portal

Publication:972741

DOI: 10.1016/J.CAM.2010.01.052
zbMath: 1202.65071
OpenAlex: W2141102526
Wikidata: Q57952896
Scholia: Q57952896
MaRDI QID: Q972741

Nezam Mahdavi-Amiri, Saman Babaie-Kafaki, Reza Ghanbari

Publication date: 21 May 2010

Published in: Journal of Computational and Applied Mathematics

Full work available at URL: https://doi.org/10.1016/j.cam.2010.01.052




Related Items (47)

Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination
A modified scaling parameter for the memoryless BFGS updating formula
A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration
A class of one parameter conjugate gradient methods
A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
Two modified scaled nonlinear conjugate gradient methods
On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
A modified scaled memoryless symmetric rank-one method
A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
A new conjugate gradient algorithm for training neural networks based on a modified secant equation
Two-step conjugate gradient method for unconstrained optimization
Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
An improved nonlinear conjugate gradient method with an optimal property
Two modified three-term conjugate gradient methods with sufficient descent property
Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
A limited memory descent Perry conjugate gradient method
A descent family of Dai–Liao conjugate gradient methods
A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint
Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
A modified BFGS algorithm based on a hybrid secant equation
A new accelerated conjugate gradient method for large-scale unconstrained optimization
Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
A modified Perry conjugate gradient method and its global convergence


Uses Software



Cites Work



