Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
Publication: 2267641
DOI: 10.1016/j.ejor.2009.11.030
zbMath: 1189.90151
OpenAlex: W2006468116
MaRDI QID: Q2267641
Publication date: 1 March 2010
Published in: European Journal of Operational Research
Full work available at URL: https://doi.org/10.1016/j.ejor.2009.11.030
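The record above is purely bibliographic, but as orientation on the method named in the title, here is a minimal sketch, not the authors' implementation, of a scaled memoryless BFGS preconditioned search direction. It assumes the common Oren-Spedicato scaling theta = (s's)/(y's) and substitutes a plain Armijo backtracking search for the Wolfe line search and acceleration step used in the paper; all function and parameter names are illustrative.

```python
import numpy as np

def smbfgs_direction(g, s, y):
    """d = -H g, where H is the BFGS update of theta*I built from the step pair (s, y)."""
    sy = s @ y
    if sy <= 1e-12:                       # curvature safeguard: fall back to steepest descent
        return -g
    theta = (s @ s) / sy                  # self-scaling (Oren-Spedicato) parameter
    gs, gy = g @ s, g @ y
    return (-theta * g
            + theta * (gy * s + gs * y) / sy
            - (1.0 + theta * (y @ y) / sy) * gs / sy * s)

def minimize(f, grad, x, iters=500, tol=1e-6):
    g = grad(x)
    d = -g                                # first iteration: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha, fx = 1.0, f(x)             # Armijo backtracking (stand-in for a Wolfe search)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        d = smbfgs_direction(g_new, s, g_new - g)
        if g_new @ d >= 0.0:              # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the two-dimensional Rosenbrock function.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(minimize(f, grad, np.array([-1.2, 1.0])))   # approaches the minimizer (1, 1)
```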
Related Items (36)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified scaling parameter for the memoryless BFGS updating formula
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A conjugate gradient sampling method for nonsmooth optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- Two modified scaled nonlinear conjugate gradient methods
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Spectral conjugate gradient methods for vector optimization problems
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- A modified scaled memoryless symmetric rank-one method
- A Five-Parameter Class of Derivative-Free Spectral Conjugate Gradient Methods for Systems of Large-Scale Nonlinear Monotone Equations
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- On the sufficient descent property of the Shanno's conjugate gradient method
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- The hybrid BFGS-CG method in solving unconstrained optimization problems
- Extension of modified Polak-Ribière-Polyak conjugate gradient method to linear equality constraints minimization problems
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- An improved nonmonotone adaptive trust region method
- A modified spectral conjugate gradient method with global convergence
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Global convergence of a modified spectral three-term CG algorithm for nonconvex unconstrained optimization problems
- A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization
Uses Software
Cites Work
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Conjugate gradient algorithms in nonconvex optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Convergence properties of the Beale-Powell restart algorithm
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence of the method of shortest residuals
- A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- Preconditioning of Truncated-Newton Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimal conditioning of self-scaling variable Metric algorithms
- Some convergence properties of the conjugate gradient method
- Matrix conditioning and nonlinear optimization
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- On the Convergence of a New Conjugate Gradient Algorithm
- On the convergence of conjugate gradient algorithms
- CUTE
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Conditioning of Quasi-Newton Methods for Function Minimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- New properties of a nonlinear conjugate gradient method
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization