A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
DOI: 10.1007/s10957-020-01636-7
zbMATH Open: 1441.90159
OpenAlex: W3006347222
MaRDI QID: Q1985287
Authors: Tsegay Giday Woldu, Hai-Bin Zhang, Xin Zhang, Yemane Hailu Fissuh
Publication date: 7 April 2020
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-020-01636-7
Recommendations
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A conjugate gradient method for solving large-scale nonsmooth minimizations
- scientific article; zbMATH DE number 7028651
Keywords: global convergence; conjugate gradient method; Moreau-Yosida regularization; nonsmooth large-scale problems
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
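As background for the keywords and classification above, the following standard formulas sketch the Moreau-Yosida regularization of a proper closed convex function f and the generic nonlinear conjugate gradient update applied to it; these are textbook definitions, not a reproduction of the article's specific choice of the parameter \(\beta_k\) or its line search.
\[
  F_\lambda(x) \;=\; \min_{y \in \mathbb{R}^n} \Bigl\{\, f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \,\Bigr\}, \qquad \lambda > 0,
\]
\[
  \nabla F_\lambda(x) \;=\; \frac{x - p_\lambda(x)}{\lambda}, \qquad
  p_\lambda(x) \;=\; \arg\min_{y \in \mathbb{R}^n} \Bigl\{\, f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \,\Bigr\},
\]
\[
  x_{k+1} \;=\; x_k + \alpha_k d_k, \qquad
  d_{k+1} \;=\; -\nabla F_\lambda(x_{k+1}) + \beta_k d_k, \qquad
  d_0 \;=\; -\nabla F_\lambda(x_0).
\]
Here \(F_\lambda\) is continuously differentiable with Lipschitz gradient even when f is nonsmooth, which is what allows gradient-type and conjugate-gradient-type methods to be applied to nonsmooth convex problems.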
Cites Work
- New limited memory bundle method for large-scale nonsmooth optimization
- Atomic Decomposition by Basis Pursuit
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Efficient generalized conjugate gradient algorithms. I: Theory
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A bundle-Newton method for nonsmooth unconstrained minimization
- Convergence of some algorithms for convex minimization
- A trust region method for nonsmooth convex optimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Comparison of formulations and solution methods for image restoration problems
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A descent algorithm for nonsmooth convex optimization
- Optimization and nonsmooth analysis
- Projected gradient methods for linearly constrained problems
- Title not available
- Monotone Operators and the Proximal Point Algorithm
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Title not available
- Proximity control in bundle methods for convex nondifferentiable minimization
- Title not available
- Survey of Bundle Methods for Nonsmooth Optimization
- A survey of nonlinear conjugate gradient methods
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Title not available
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- The smoothing FR conjugate gradient method for solving a kind of nonsmooth optimization problem with \(l_1\)-norm
- A new smoothing modified three-term conjugate gradient method for \(l_1\)-norm minimization problem
Cited In (9)
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- Title not available
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified PRP conjugate gradient method for unconstrained optimization and nonlinear equations
- A convergent hybrid three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Title not available
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization