New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
Publication: 875393
DOI: 10.1016/j.cam.2006.03.005
zbMath: 1116.65069
OpenAlex: W2083770700
Wikidata: Q59241592 (Scholia: Q59241592)
MaRDI QID: Q875393
Chun-Ming Tang, Zeng-xin Wei, Guoyin Li
Publication date: 13 April 2007
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2006.03.005
Keywords: algorithm; unconstrained optimization; global convergence; numerical examples; conjugate gradient method; conjugacy condition; quasi-Newton equation
MSC classifications: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
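As background for the keywords above (a sketch in standard conjugate gradient notation, not quoted from the paper itself): the classical conjugacy condition requires consecutive search directions to be conjugate with respect to the gradient change, while Dai and Liao's extension, which this line of work builds on, ties the condition to the quasi-Newton (secant) equation.

```latex
% Classical conjugacy condition for search directions d_k,
% with gradient change y_{k-1} = g_k - g_{k-1}:
d_k^{\top} y_{k-1} = 0.
% Dai-Liao conjugacy condition (parameter t \ge 0), motivated by the
% quasi-Newton equation B_k s_{k-1} = y_{k-1}, where s_{k-1} = x_k - x_{k-1}:
d_k^{\top} y_{k-1} = -t\, g_k^{\top} s_{k-1}.
```
Setting t = 0 recovers the classical condition; modified secant equations lead to further variants of the right-hand side.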
Related Items
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified scaling parameter for the memoryless BFGS updating formula
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- A conjugate gradient type method for the nonnegative constraints optimization problems
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- A modified conjugate gradient method based on a modified secant equation
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
- A class of one parameter conjugate gradient methods
- A self-adjusting spectral conjugate gradient method for large-scale unconstrained optimization
- A distributed conjugate gradient online learning method over networks
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- A variant spectral-type FR conjugate gradient method and its global convergence
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Two-step conjugate gradient method for unconstrained optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- Globally convergent modified Perry's conjugate gradient method
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A modified conjugacy condition and related nonlinear conjugate gradient method
- A hybrid of DL and WYL nonlinear conjugate gradient methods
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- Extension of modified Polak-Ribière-Polyak conjugate gradient method to linear equality constraints minimization problems
- An improved nonlinear conjugate gradient method with an optimal property
- A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A limited memory descent Perry conjugate gradient method
- A new search procedure of steepest ascent in response surface exploration
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- A modified three-term conjugate gradient method with sufficient descent property
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A descent family of Dai–Liao conjugate gradient methods
- Convergence analysis of a modified BFGS method on convex minimizations
- An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- Spectral three-term constrained conjugate gradient algorithm for function minimizations
- A conjugate gradient method with sufficient descent property
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A NEW DERIVATIVE-FREE CONJUGATE GRADIENT METHOD FOR LARGE-SCALE NONLINEAR SYSTEMS OF EQUATIONS
- Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
Cites Work
- Global convergence of a two-parameter family of conjugate gradient methods without line search
- A new method for nonsmooth convex optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- A three-parameter family of nonlinear conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence analysis of a proximal Newton method
- Proximité et dualité dans un espace hilbertien
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- New conjugacy conditions and related nonlinear conjugate gradient methods