A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
Publication: 295480
DOI: 10.1007/s12190-015-0912-8
zbMATH Open: 1352.65159
OpenAlex: W1113115316
MaRDI QID: Q295480
Publication date: 13 June 2016
Published in: Journal of Applied Mathematics and Computing
Full work available at URL: https://doi.org/10.1007/s12190-015-0912-8
Keywords: conjugate gradient; algorithm; global convergence; numerical result; nonmonotone technique; nonsmooth convex minimization
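For context, the keywords describe a Polak-Ribière-Polyak (PRP) conjugate gradient direction paired with a nonmonotone line search. The sketch below is a minimal, generic illustration of that combination for a smooth objective, assuming a Grippo-Lampariello-Lucidi-style nonmonotone Armijo rule and a PRP+ truncation; the function name `prp_cg_nonmonotone` and its parameters are hypothetical, and the code does not reproduce the paper's specific modification for nonsmooth convex problems.

```python
import numpy as np


def prp_cg_nonmonotone(f, grad, x0, M=10, c1=1e-4, rho=0.5, tol=1e-6, max_iter=500):
    """Generic PRP conjugate gradient iteration with a nonmonotone
    Armijo backtracking line search (sketch; assumes f is smooth)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # initial direction: steepest descent
    f_hist = [f(x)]              # memory of recent objective values
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone Armijo rule: the sufficient-decrease test compares
        # against the maximum of the last M objective values, not f(x).
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while f(x + alpha * d) > f_ref + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak coefficient, truncated at zero (PRP+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x


if __name__ == "__main__":
    # Tiny smooth convex quadratic as a sanity check.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(prp_cg_nonmonotone(f, grad, np.zeros(2)))  # approx. A^{-1} b = [0.6, -0.8]
```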
Cites Work
- Algorithm 851
- New limited memory bundle method for large-scale nonsmooth optimization
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Efficient hybrid conjugate gradient techniques
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- A bundle-Newton method for nonsmooth unconstrained minimization
- Convergence of some algorithms for convex minimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- A nonsmooth version of Newton's method
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- Descent Property and Global Convergence of the Fletcher–Reeves Method with Inexact Line Search
- Projected gradient methods for linearly constrained problems
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Proximity control in bundle methods for convex nondifferentiable minimization
- New quasi-Newton methods for unconstrained optimization problems
- A limited memory BFGS-type method for large-scale unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- Nonmonotone line search for minimax problems
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Practical Aspects of the Moreau–Yosida Regularization: Theoretical Preliminaries
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- Convergence analysis of a modified BFGS method on convex minimizations
Cited In (7)
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified PRP conjugate gradient method for unconstrained optimization and nonlinear equations
- A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
- Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: deterministic convergence and its application
- A modified PRP-type conjugate gradient projection algorithm for solving large-scale monotone nonlinear equations with convex constraint