A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
DOI: 10.1007/s11075-023-01559-0
arXiv: 2301.02863
MaRDI QID: Q6141538
No author found.
Publication date: 19 December 2023
Published in: Numerical Algorithms
Full work available at URL: https://arxiv.org/abs/2301.02863
Keywords: orthogonality; quasi-Newton method; limited memory; regularization model; subspace minimization conjugate gradient method
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
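For orientation on the keywords above: a subspace minimization conjugate gradient (SMCG) method computes each search direction by minimizing a quadratic model of the objective over a low-dimensional subspace spanned by the current (negative) gradient and the previous step. The following is a generic sketch of that subproblem in the standard SMCG framework, not necessarily the exact regularized formulation of this paper:
\[
d_k \;=\; \operatorname*{arg\,min}_{d \in \Omega_k} \; g_k^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d,
\qquad
\Omega_k \;=\; \operatorname{span}\{-g_k,\, s_{k-1}\},
\]
where \(g_k = \nabla f(x_k)\) is the current gradient, \(s_{k-1} = x_k - x_{k-1}\) is the previous step, and \(B_k\) approximates the Hessian; in a regularized limited memory variant, as the title indicates, \(B_k\) would be built from limited memory quasi-Newton updates together with a regularization term.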
Cites Work
- A Barzilai-Borwein conjugate gradient method
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- A class of accelerated subspace minimization conjugate gradient methods
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Numerical Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Globally and Superlinearly Convergent Gauss-Newton-Based BFGS Method for Symmetric Nonlinear Equations
- Descent Directions of Quasi-Newton Methods for Symmetric Nonlinear Equations
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- CUTEr and SifDec: A constrained and unconstrained testing environment, revisited
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A new nonmonotone line search technique for unconstrained optimization
- Benchmarking optimization software with performance profiles