An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
From MaRDI portal
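For context, the publication concerns three-term conjugate gradient directions built from a self-scaling memoryless BFGS matrix. Below is a minimal illustrative sketch of a *generic* three-term CG iteration (a Zhang–Zhou–Li-type direction with a simple Armijo backtracking line search); the update rule, line search, and test problem here are assumptions for illustration, not the paper's adaptive SSML-BFGS formula.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Generic three-term conjugate gradient sketch (illustrative only).

    Direction (Zhang-Zhou-Li-type, NOT the paper's adaptive SSML-BFGS rule):
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    with y_k = g_{k+1} - g_k,
        beta_k  = g_{k+1}^T y_k / d_k^T y_k,
        theta_k = g_{k+1}^T d_k / d_k^T y_k,
    which guarantees sufficient descent: g_{k+1}^T d_{k+1} = -||g_{k+1}||^2.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a simple illustrative choice;
        # methods of this kind typically use a Wolfe-type search).
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) > 1e-16:
            beta = (g_new @ y) / denom
            theta = (g_new @ d) / denom
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new  # restart with steepest descent if curvature vanishes
        x, g = x_new, g_new
    return x

# Usage: convex quadratic f(x) = 0.5 x^T A x - b^T x, whose minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

The extra `y_k` term is what distinguishes three-term methods from classical two-term CG: it enforces the sufficient descent condition independently of the line search accuracy.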
Recommendations
- A scaled three-term conjugate gradient method for unconstrained optimization
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- Three-term modified LS conjugate gradient method with its global convergence
Cites work
- Scientific article (zbMATH DE number 3850830; no title available)
- Scientific article (zbMATH DE number 3526471; no title available)
- Scientific article (zbMATH DE number 5060482; no title available)
- Scientific article (zbMATH DE number 3278849; no title available)
- A Family of Variable-Metric Methods Derived by Variational Means
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A descent family of Dai-Liao conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- A new approach to variable metric algorithms
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Adaptive restart for accelerated gradient schemes
- An improved nonlinear conjugate gradient method with an optimal property
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- Conditioning of Quasi-Newton Methods for Function Minimization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Function minimization by conjugate gradients
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Methods of conjugate gradients for solving linear systems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New quasi-Newton methods for unconstrained optimization problems
- On the Convergence of a New Conjugate Gradient Algorithm
- On three-term conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Some methods of speeding up the convergence of iteration methods
- Subgradient methods for huge-scale optimization problems
- Technical Note—A Modified Conjugate Gradient Algorithm
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- The conjugate gradient method in extremal problems
- Two optimal Dai-Liao conjugate gradient methods
Cited in (13)
- A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- Signal recovery with constrained monotone nonlinear equations through an effective three-term conjugate gradient method
- An accelerated conjugate gradient method with adaptive two-parameter with applications in image restoration
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- A class of globally convergent three-term Dai-Liao conjugate gradient methods
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- An adaptive family of projection methods for constrained monotone nonlinear equations with applications
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- Some three-term conjugate gradient methods with the new direction structure
- A scaled three-term conjugate gradient method for unconstrained optimization
This page was built for publication: An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
MaRDI item: Q1677473