SCALCG
From MaRDI portal
Software: 20462
swMATH: 8453 · MaRDI QID: Q20462 · FDO: Q20462
Author name not available
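SCALCG refers to scaled conjugate gradient algorithms for large-scale unconstrained optimization (cf. the entry "Scaled conjugate gradient algorithms for unconstrained optimization" in the citation list below). For orientation only, the following is a minimal, generic Python sketch of one scaled (spectral) conjugate gradient iteration; the spectral scaling parameter, the Perry-type beta, the Armijo backtracking line search, and the restart rules are illustrative assumptions and do not reproduce the SCALCG implementation.

# Minimal, generic sketch of a scaled (spectral) conjugate gradient method.
# NOT the SCALCG code: the spectral scaling theta = s's/s'y, the Perry-type
# beta, the Armijo backtracking, and the restart tests are assumptions
# chosen for illustration only.
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search (illustrative choice).
        alpha, fx, gTd = 1.0, f(x), float(g @ d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gTd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        if sy > 1e-12:
            theta = float(s @ s) / sy        # spectral scaling parameter
            beta = float((theta * y - s) @ g_new) / sy
            d = -theta * g_new + beta * s    # scaled CG direction
            # Restart with scaled steepest descent if descent is lost.
            if float(d @ g_new) >= 0.0:
                d = -theta * g_new
        else:
            d = -g_new                       # restart on loss of curvature
        x, g = x_new, g_new
    return x

# Usage example: minimize the two-dimensional Rosenbrock function.
if __name__ == "__main__":
    f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                               200.0 * (x[1] - x[0]**2)])
    print(scaled_cg(f, grad, np.array([-1.2, 1.0])))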
Cited In (only showing first 100 items)
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- The hybrid BFGS-CG method in solving unconstrained optimization problems
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- Two classes of spectral conjugate gradient methods for unconstrained optimizations
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point
- A new hybrid PRPFR conjugate gradient method for solving nonlinear monotone equations and image restoration problems
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- Novel gradient‐based methods for heat flux retrieval
- An efficient adaptive scaling parameter for the spectral conjugate gradient method
- A new derivative-free SCG-type projection method for nonlinear monotone equations with convex constraints
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- An accelerated conjugate gradient algorithm for solving nonlinear monotone equations and image restoration problems
- A global convergence of LS-CD hybrid conjugate gradient method
- Comments on "New hybrid conjugate gradient method as a convex combination of FR and PRP methods"
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- An improved nonmonotone adaptive trust region method
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- An improved Perry conjugate gradient method with adaptive parameter choice
- Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method
- Optimal scaling parameters for spectral conjugate gradient methods
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- A new family of conjugate gradient methods for unconstrained optimization
- Human motion estimation based on low dimensional space incremental learning
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- Comments on "Hybrid conjugate gradient algorithm for unconstrained optimization"
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Nonlinear optimization applications using the GAMS technology
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- A New Adaptive Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- Global convergence of a modified Fletcher–Reeves conjugate gradient method with Wolfe line search
- A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- A variant spectral-type FR conjugate gradient method and its global convergence
- A new adaptive trust region algorithm for optimization problems
- A double parameter scaled BFGS method for unconstrained optimization
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- A method of two new augmented Lagrange multiplier versions for solving constrained problems
- New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
- Applying Powell's symmetrical technique to conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Two descent hybrid conjugate gradient methods for optimization
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A modified scaling parameter for the memoryless BFGS updating formula
- Scaled conjugate gradient algorithms for unconstrained optimization
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Two modified scaled nonlinear conjugate gradient methods
- A modified Perry conjugate gradient method and its global convergence
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Hybrid conjugate gradient algorithm for unconstrained optimization
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
- Another nonlinear conjugate gradient algorithm for unconstrained optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Two new conjugate gradient methods based on modified secant equations
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A conjugate gradient method with sufficient descent property
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A non-monotone line search algorithm for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- On the sufficient descent property of the Shanno's conjugate gradient method
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- A modified scaled memoryless symmetric rank-one method
- A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- A new nonmonotone line search technique for unconstrained optimization
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Title not available
- Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
This page was built for software: SCALCG