On the Convergence of a New Conjugate Gradient Algorithm
From MaRDI portal
Publication:4196255
Cited in (64 documents):
- A new subspace minimization conjugate gradient method for unconstrained minimization
- An adaptive modified three-term conjugate gradient method with global convergence
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- An overview of nonlinear optimization
- An efficient inertial subspace minimization CG algorithm with convergence rate analysis for constrained nonlinear monotone equations
- A family of limited memory three term conjugate gradient methods
- An inertial spectral CG projection method based on the memoryless BFGS update
- Global convergence of conjugate gradient method
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Convergence of the Polak-Ribière-Polyak conjugate gradient method
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- A new nonmonotone line search technique for unconstrained optimization
- Symmetric Perry conjugate gradient method
- Scaled conjugate gradient algorithms for unconstrained optimization
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- Some new three-term Hestenes-Stiefel conjugate gradient methods with affine combination
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Stochastic heavy-ball method for constrained stochastic optimization problems
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- Globally convergent conjugate gradient algorithms
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Multi-time-step and two-scale domain decomposition method for non-linear structural dynamics
- Some remarks on conjugate gradient methods without line search
- On the method of shortest residuals for unconstrained optimization
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A new three-term conjugate gradient method with descent direction for unconstrained optimization
- Convergence conditions, line search algorithms and trust region implementations for the Polak–Ribière conjugate gradient method
- Truncated-Newton algorithms for large-scale unconstrained optimization
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- The convergence of conjugate gradient method with nonmonotone line search
- A modified conjugate gradient method based on a modified secant equation
- A new conjugate gradient algorithm with cubic Barzilai-Borwein stepsize for unconstrained optimization
- Inverse determination of a heat source from natural convection in a porous cavity
- Convergence of Liu-Storey conjugate gradient method
- Vectorization of conjugate-gradient methods for large-scale minimization in meteorology
- Convergence of PRP method with new nonmonotone line search
- A derivative-free scaling memoryless DFP method for solving large scale nonlinear monotone equations
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Application of scaled nonlinear conjugate-gradient algorithms to the inverse natural convection problem
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A new derivative-free SCG-type projection method for nonlinear monotone equations with convex constraints
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- An example of numerical nonconvergence of a variable-metric method
- A new family of conjugate gradient methods for unconstrained optimization
- An adaptive competitive penalty method for nonsmooth constrained optimization
- A three-term derivative-free projection method for nonlinear monotone system of equations
- On the limited memory BFGS method for large scale optimization
- A new family of conjugate gradient methods
- A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- A conjugate gradient sampling method for nonsmooth optimization
- New quasi-Newton methods for unconstrained optimization problems
- On three-term conjugate gradient algorithms for unconstrained optimization