A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
Publication: Q295480 (MaRDI QID)
DOI: 10.1007/s12190-015-0912-8
zbMath: 1352.65159
OpenAlex: W1113115316
Publication date: 13 June 2016
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-015-0912-8
Keywords: algorithm; global convergence; conjugate gradient; numerical result; nonmonotone technique; nonsmooth convex minimization
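The entry carries no technical detail beyond the title and keywords. For orientation, below is a minimal sketch of the two ingredients named there: a Polak-Ribière-Polyak (PRP) conjugate gradient update and a nonmonotone Armijo backtracking line search in the Grippo-Lampariello-Lucidi style. The sketch assumes a smooth objective; papers of this type typically apply such a scheme to a smoothed surrogate of the nonsmooth convex function (e.g. a Moreau-Yosida regularization, cf. the cited works below). All names and parameter values are illustrative and are not taken from the paper.

```python
# Sketch of a PRP conjugate gradient loop with a nonmonotone Armijo
# line search. Illustrative only: the paper's modified PRP update and
# its treatment of the nonsmooth objective are not reproduced here.
import numpy as np

def prp_nonmonotone_cg(f, grad, x0, max_iter=500, memory=10,
                       delta=1e-4, rho=0.5, tol=1e-6):
    """Minimize a smooth f: R^n -> R starting from x0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                         # first direction: steepest descent
    f_hist = [f(x)]                # recent values for nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0:          # safeguard: restart if d is not
            d = -g                 # a descent direction
        # Nonmonotone Armijo: accept a step that decreases f relative
        # to the max of the last `memory` values, not just f(x).
        f_ref = max(f_hist[-memory:])
        alpha = 1.0
        while f(x + alpha * d) > f_ref + delta * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP conjugacy parameter, clipped at zero (the PRP+ variant).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

if __name__ == "__main__":
    # Demo on a convex quadratic (illustrative).
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    x_star = prp_nonmonotone_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                                lambda x: A @ x - b,
                                np.zeros(2))
    print(x_star)  # should approximate the solution of A x = b
```

Clipping beta at zero (PRP+) and restarting along the steepest descent direction are the standard safeguards used to obtain global convergence results of the kind the keywords advertise.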
Related Items
- Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: deterministic convergence and its application
- A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
- A three-term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
Cites Work
- Convergence analysis of a modified BFGS method on convex minimizations
- Proximity control in bundle methods for convex nondifferentiable minimization
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A limited memory BFGS-type method for large-scale unconstrained optimization
- Efficient hybrid conjugate gradient techniques
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- A bundle-Newton method for nonsmooth unconstrained minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- Nonmonotone line search for minimax problems
- Convergence of some algorithms for convex minimization
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- A globally convergent version of the Polak-Ribière conjugate gradient method
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- A nonsmooth version of Newton's method
- New quasi-Newton methods for unconstrained optimization problems
- Convergence Properties of Algorithms for Nonlinear Optimization
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Projected gradient methods for linearly constrained problems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- New limited memory bundle method for large-scale nonsmooth optimization
- The conjugate gradient method in extremal problems