Preconditioned nonlinear conjugate gradient methods based on a modified secant equation
From MaRDI portal
Publication:2422866
DOI: 10.1016/j.amc.2017.08.029
zbMath: 1426.65081
OpenAlex: W2751136833
MaRDI QID: Q2422866
Giovanni Fasano, Andrea Caliciotti, Massimo Roma
Publication date: 21 June 2019
Published in: Applied Mathematics and Computation
Full work available at URL: http://hdl.handle.net/10278/3692568
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Numerical optimization and variational techniques (65K10)
- Methods of reduced gradient type (90C52)
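The paper concerns preconditioned nonlinear conjugate gradient methods, where the authors build the preconditioner from a modified secant equation. The following is only a generic textbook sketch of a preconditioned nonlinear CG iteration (Polak-Ribière-plus update with Armijo backtracking and a user-supplied preconditioner application `M_inv`, here defaulting to the identity); it does not reproduce the authors' secant-based preconditioner construction.

```python
import numpy as np

def preconditioned_ncg(f, grad, x0, M_inv=None, tol=1e-6, max_iter=200):
    """Generic preconditioned nonlinear CG sketch (PR+ beta, Armijo search).

    M_inv: callable applying a (here assumed fixed) preconditioner to a
    vector; defaults to the identity. This is NOT the paper's method, whose
    preconditioner comes from a modified secant equation.
    """
    if M_inv is None:
        M_inv = lambda v: v  # identity preconditioner
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    z = M_inv(g)          # preconditioned gradient
    d = -z                # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing a simple Armijo condition.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        z_new = M_inv(g_new)
        # Polak-Ribiere beta with preconditioning, clipped at zero (PR+).
        beta = max(0.0, g_new.dot(z_new - z) / g.dot(z))
        d = -z_new + beta * d
        x, g, z = x_new, g_new, z_new
    return x
```

Supplying a good `M_inv` (e.g. an approximate inverse Hessian, as the quasi-Newton-based preconditioners in the cited works do) typically reduces the iteration count substantially compared with the identity.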
Related Items
- On the convergence ball and error analysis of the modified secant method
- Foreword to the special issue "Recent trends in numerical computations: theory and algorithms"
- An Improvement of the Pivoting Strategy in the Bunch and Kaufman Decomposition, Within Truncated Newton Methods
- Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration
- On partial Cholesky factorization and a variant of quasi-Newton preconditioners for symmetric positive definite matrices
- Joint-block-sparsity for efficient 2-D DOA estimation with multiple separable observations
Uses Software
Cites Work
- CG_DESCENT
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- On the limited memory BFGS method for large scale optimization
- Conjugate gradient algorithms in nonconvex optimization
- A globally convergent version of the Polak-Ribière conjugate gradient method
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- A conjugate directions approach to improve the limited-memory BFGS method
- Robust Approximate Inverse Preconditioning for the Conjugate Gradient Method
- On A Class of Limited Memory Preconditioners For Large Scale Linear Systems With Multiple Right-Hand Sides
- A Preconditioner for A Primal-Dual Newton Conjugate Gradient Method for Compressed Sensing Problems
- Algorithm 851
- QN-like variable storage conjugate gradients
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- A Relationship between the BFGS and Conjugate Gradient Algorithms and Its Implications for New Algorithms
- Updating Quasi-Newton Matrices with Limited Storage
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Line search algorithms with guaranteed sufficient decrease
- Trust Region Methods
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
- On the Order of Convergence of Preconditioned Nonlinear Conjugate Gradient Methods
- A Preconditioning Framework for Sequences of Diagonally Modified Linear Systems Arising in Optimization
- A Matrix-Free Preconditioner for Sparse Symmetric Positive Definite Systems and Least-Squares Problems
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The Limited Memory Conjugate Gradient Method
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization