The Limited Memory Conjugate Gradient Method
Publication: 5408214
DOI: 10.1137/120898097
zbMath: 1298.90129
OpenAlex: W2008374458
MaRDI QID: Q5408214
Hongchao Zhang, William W. Hager
Publication date: 9 April 2014
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/856a6b243a9b345bf298ed89380e433fa6d8d22e
Keywords: unconstrained optimization; conjugate gradient method; nonlinear conjugate gradients; adaptive method; L-BFGS; limited memory; BFGS; reduced Hessian method; limited memory BFGS; CG_DESCENT; L-RHR
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Methods of reduced gradient type (90C52)
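The keywords above point to the Hager-Zhang conjugate gradient scheme behind CG_DESCENT. As a rough illustration only, the sketch below implements the plain (memoryless) Hager-Zhang direction update with a Wolfe line search; the limited memory method of the paper additionally carries out a quasi-Newton-like minimization over a small stored subspace of recent directions, which is not reproduced here. All function and parameter names are illustrative assumptions, not taken from the paper or its software.

```python
# Minimal sketch of the Hager-Zhang nonlinear CG update (the basis of
# CG_DESCENT), not the paper's limited-memory algorithm.  Names are illustrative.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def hz_cg(f, grad, x0, tol=1e-6, max_iter=1000, eta=0.01):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = line_search(f, grad, x, d)[0]   # SciPy strong-Wolfe line search
        if alpha is None:                       # line search failed: restart
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        # Hager-Zhang beta with the lower truncation that keeps directions descent
        beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
        eta_k = -1.0 / (np.linalg.norm(d) * min(eta, np.linalg.norm(g)))
        beta = max(beta, eta_k)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the 5-dimensional Rosenbrock function from the origin
x_star = hz_cg(rosen, rosen_der, np.zeros(5))
```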
Related Items
A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property
An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
Speeding up the convergence of the Polyak's heavy ball algorithm
An improved Perry conjugate gradient method with adaptive parameter choice
An active set trust-region method for bound-constrained optimization
Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search
Symmetric Perry conjugate gradient method
An optimal subgradient algorithm for large-scale bound-constrained convex optimization
Further comment on another hybrid conjugate gradient algorithm for unconstrained optimization by Andrei
A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
A decent three term conjugate gradient method with global convergence properties for large scale unconstrained optimization problems
A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
Alternating cyclic vector extrapolation technique for accelerating nonlinear optimization algorithms and fixed-point mapping applications
A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
A robust BFGS algorithm for unconstrained nonlinear optimization problems
An overview of nonlinear optimization
A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization
Inexact Newton-type methods based on Lanczos orthonormal method and application for full waveform inversion
A novel fractional Tikhonov regularization coupled with an improved super-memory gradient method and application to dynamic force identification problems
Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations
An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method
A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
Higher order curvature information and its application in a modified diagonal Secant method
An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
On exact linesearch quasi-Newton methods for minimizing a quadratic function
Exploiting damped techniques for nonlinear conjugate gradient methods
A limited memory descent Perry conjugate gradient method
Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
Planar methods and grossone for the conjugate gradient breakdown in nonlinear programming
An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization
New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
A novel method of dynamic force identification and its application
A convexity enforcing \(C^0\) interior penalty method for the Monge-Ampère equation on convex polygonal domains
An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
Truncated trust region method for nonlinear inverse problems and application in full-waveform inversion
A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
A three-term conjugate gradient method with accelerated subspace quadratic optimization
On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems
A one-parameter class of three-term conjugate gradient methods with an adaptive parameter choice
Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
A modified Perry conjugate gradient method and its global convergence
Uses Software