Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
From MaRDI portal
Publication:6110199
DOI: 10.1007/s11075-022-01440-6 · OpenAlex: W4382654973 · MaRDI QID: Q6110199
Publication date: 4 July 2023
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01440-6
Cites Work
- Unnamed Item
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Analysis of a self-scaling quasi-Newton method
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A globally convergent and efficient method for unconstrained discrete-time optimal control
- Global convergence of the Fletcher-Reeves algorithm with inexact linesearch
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- A limited memory \(q\)-BFGS algorithm for unconstrained optimization problems
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- A \(q\)-Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization problems
- Modified spectral PRP conjugate gradient method for solving tensor eigenvalue complementarity problems
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
- On \(q\)-BFGS algorithm for unconstrained optimization problems
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- An improved Polak-Ribière-Polyak conjugate gradient method with an efficient restart direction
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Some three-term conjugate gradient methods with the new direction structure
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter
- Smoothing Nonlinear Conjugate Gradient Method for Image Restoration Using Nonsmooth Nonconvex Minimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence properties of the Fletcher-Reeves method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A self-correcting variable-metric algorithm framework for nonsmooth optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- An Optimal Extension of the Polak–Ribière–Polyak Conjugate Gradient Method
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles