A conjugate gradient method based on a modified secant relation for unconstrained optimization
From MaRDI portal
Publication:4959904
Recommendations
- A modified conjugate gradient method for unconstrained optimization
- A modified conjugate gradient method based on a modified secant equation
- Two new conjugate gradient methods based on modified secant equations
- A modified form of conjugate gradient method for unconstrained optimization problems
- A new family of Dai-Liao conjugate gradient methods with modified secant equation for unconstrained optimization
Cites work
- scientific article; zbMATH DE number 3843083 (no title available)
- scientific article; zbMATH DE number 2063453 (no title available)
- scientific article; zbMATH DE number 778130 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- scientific article; zbMATH DE number 3278849 (no title available)
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- A modified BFGS method and its global convergence in nonconvex minimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A note on global convergence result for conjugate gradient methods
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Benchmarking optimization software with performance profiles
- CUTEr and SifDec
- Convergence analysis of a modified BFGS method on convex minimizations
- Efficient generalized conjugate gradient algorithms. I: Theory
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence result for conjugate gradient methods
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Methods of conjugate gradients for solving linear systems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- The conjugate gradient method in extremal problems
- Two new conjugate gradient methods based on modified secant equations
Cited in (9)
- The modified BFGS method with new secant relation for unconstrained optimization problems
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- Two-step conjugate gradient method for unconstrained optimization
- An efficient mixed conjugate gradient method for solving unconstrained optimisation problems
- A modified conjugate gradient method based on a modified secant equation
- A second-derivative-free modified secant-like method with order 2.732… for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- An online conjugate gradient algorithm for large-scale data analysis in machine learning
- A monotone gradient method via weak secant equation for unconstrained optimization