Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
DOI: 10.1016/j.cam.2012.01.036 · zbMath: 1258.65059 · OpenAlex: W2170270075 · MaRDI QID: Q442712
Yasushi Narushima, Hiroshi Yabe
Publication date: 3 August 2012
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2012.01.036
Keywords: unconstrained optimization; global convergence; conjugate gradient method; large-scale; descent search direction; secant condition
MSC classifications: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
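The methods this record concerns combine the nonlinear conjugate gradient update with a secant (quasi-Newton) condition so that, per the title, every search direction is a descent direction. As a rough orientation only, the sketch below implements a generic Dai-Liao-type update, the family several of the related items below build on, with an Armijo backtracking line search and a steepest-descent restart to enforce descent. It is not the paper's specific method; the function names (rosenbrock, dai_liao_cg) and the parameter t are illustrative assumptions.

```python
# Illustrative sketch only: a generic Dai-Liao-type conjugate gradient
# loop, NOT the exact method of the paper being indexed here.
import numpy as np

def rosenbrock(x):
    """Classic 2-D test function; returns value and gradient."""
    f = 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    g = np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    return f, g

def dai_liao_cg(f_g, x0, t=1.0, tol=1e-6, max_iter=500):
    """Dai-Liao conjugate gradient with Armijo backtracking.

    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    f, g = f_g(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search
        alpha, c1 = 1.0, 1e-4
        gtd = g @ d
        while True:
            f_new, g_new = f_g(x + alpha * d)
            if f_new <= f + c1 * alpha * gtd or alpha < 1e-12:
                break
            alpha *= 0.5
        x_new = x + alpha * d
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if d @ g_new >= 0.0:                # blunt safeguard: restart if not descent
            d = -g_new
        x, f, g = x_new, f_new, g_new
    return x

print(dai_liao_cg(rosenbrock, np.array([-1.2, 1.0])))  # approaches [1, 1]
```

The restart to the negative gradient is the bluntest way to keep directions descent; judging from the title, the indexed paper's contribution is to build the update from secant conditions so that descent directions are generated by construction.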
Related Items
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- General four-step discrete-time zeroing and derivative dynamics applied to time-varying nonlinear optimization
- Continuous and discrete Zhang dynamics for real-time varying nonlinear optimization
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
- A modified conjugate gradient method based on a modified secant equation
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A limited memory descent Perry conjugate gradient method
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A hybrid self-adaptive conjugate first order reliability method for robust structural reliability analysis
- A conjugate gradient method with sufficient descent property
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A modified Perry conjugate gradient method and its global convergence
Uses Software
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Technical Note—A Modified Conjugate Gradient Algorithm
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE: constrained and unconstrained testing environment
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
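The last cited work, Dolan and Moré's performance profiles, is the standard device for comparing conjugate gradient variants numerically on test sets such as CUTE. A minimal sketch of the idea, using a made-up timing matrix (the function name performance_profile and all numbers are illustrative assumptions):

```python
# Minimal sketch of Dolan-More performance profiles.
import numpy as np

def performance_profile(T, taus):
    """T[p, s] = cost of solver s on problem p (np.inf = failure).

    Returns rho[s, i] = fraction of problems on which solver s is
    within a factor taus[i] of the best solver on that problem.
    """
    best = T.min(axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                     # performance ratios r_{p,s}
    n_problems = T.shape[0]
    return np.array([(ratios <= tau).sum(axis=0) / n_problems
                     for tau in taus]).T

# Hypothetical costs for 2 solvers on 4 problems
T = np.array([[1.0, 2.0],
              [3.0, 1.5],
              [2.0, np.inf],              # solver 2 fails on problem 3
              [4.0, 4.0]])
print(performance_profile(T, taus=[1.0, 2.0, 4.0]))
# rho[s, 0] is the fraction of wins; rho[s, -1] approaches the
# fraction of problems solver s solves at all.
```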