A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
From MaRDI portal
Publication:5300528
Recommendations
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- Nonlinear conjugate gradient methods with Wolfe type line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A new globally convergent conjugate gradient method with Wolfe line search
- Global convergence of a conjugate gradient method with strong Wolfe-Powell line search
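The recommended works above all follow the same template as the publication itself: a nonlinear conjugate gradient iteration paired with a Wolfe-type line search. As an illustration only, here is a minimal generic sketch of that template in Python — a Polak-Ribière+ update with a simplified strong Wolfe bracketing search. This is not the specific algorithm or the improved Wolfe line search of the cited publication; all function names and parameter values are illustrative choices.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.1, alpha=1.0, max_iter=50):
    """Simplified bracketing search for the strong Wolfe conditions.

    Illustrative only -- not the improved Wolfe search of the publication.
    """
    phi0 = f(x)
    dphi0 = grad(x) @ d          # directional derivative at alpha = 0
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        phi = f(x + alpha * d)
        if phi > phi0 + c1 * alpha * dphi0:
            hi = alpha           # sufficient-decrease condition failed: shrink
        else:
            dphi = grad(x + alpha * d) @ d
            if abs(dphi) <= -c2 * dphi0:
                return alpha     # strong Wolfe curvature condition satisfied
            if dphi < 0:
                lo = alpha       # step too short: grow
            else:
                hi = alpha
        alpha = 2 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def pr_plus_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Polak-Ribiere+ nonlinear conjugate gradient with the search above."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ parameter: truncation at zero acts as an automatic restart
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

Variants in the lists above differ mainly in the choice of the update parameter (Dai-Liao, Hager-Zhang, Hestenes-Stiefel, three-term forms, ...) and in the line-search conditions, while keeping this overall loop structure.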
Cited in
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A modified spectral conjugate gradient method with global convergence
- A modified conjugate gradient method for general convex functions
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations
- A new conjugate gradient method with an efficient memory structure
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Two modified conjugate gradient methods for unconstrained optimization with applications in image restoration problems
- Theoretical characteristics and numerical methods for a class of special piecewise quadratic optimization
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- A class of accelerated subspace minimization conjugate gradient methods
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- An efficient inertial subspace minimization CG algorithm with convergence rate analysis for constrained nonlinear monotone equations
- An accelerated relaxed-inertial strategy based CGP algorithm with restart technique for constrained nonlinear pseudo-monotone equations to image de-blurring problems
- A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- Three modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme
- A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems
- A mini-batch stochastic conjugate gradient algorithm with variance reduction
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization
- A modified spectral conjugate gradient method for solving unconstrained minimization problems
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A new subspace minimization conjugate gradient method for unconstrained minimization
- Two adaptive Dai-Liao nonlinear conjugate gradient methods
- A modified conjugate gradient method based on a modified secant equation
- Two optimal Dai-Liao conjugate gradient methods
- A modified Hestenes-Stiefel conjugate gradient method with an optimal property
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration
- A conjugate gradient algorithm without Lipschitz continuity and its applications
- A global convergence of LS-CD hybrid conjugate gradient method
- On two symmetric Dai-Kou type schemes for constrained monotone equations with image recovery application
- Total variation superiorized conjugate gradient method for image reconstruction
- scientific article; zbMATH DE number 3307155
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- Matrix analyses on the Dai-Liao conjugate gradient method
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- Nonlinear conjugate gradient for smooth convex functions
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A modified scaling parameter for the memoryless BFGS updating formula
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- A family of spectral conjugate gradient methods with strong convergence and its applications in image restoration and machine learning
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- An inertial spectral CG projection method based on the memoryless BFGS update
- Some new three-term Hestenes-Stiefel conjugate gradient methods with affine combination
- A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications
- A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
- A new approximate descent derivative-free algorithm for large-scale nonlinear symmetric equations
- A new efficient conjugate gradient method for unconstrained optimization
- Two improved nonlinear conjugate gradient methods with the strong Wolfe line search
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients
- A descent family of Dai-Liao conjugate gradient methods
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- New gradient methods with adaptive stepsizes by approximate models
- A descent Dai-Liao projection method for convex constrained nonlinear monotone equations with applications
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- An improved Perry conjugate gradient method with adaptive parameter choice
- Spectral conjugate gradient methods for vector optimization problems
- An improved PRP type spectral conjugate gradient method with restart steps
- A modified nonlinear Polak-Ribière-Polyak conjugate gradient method with sufficient descent property
- A conjugate gradient method with sufficient descent property
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A new structured spectral conjugate gradient method for nonlinear least squares problems
- A globally convergent hybrid conjugate gradient method and its numerical behaviors
- An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A nonmonotone hybrid method of conjugate gradient and Lanczos-type for solving nonlinear systems
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization