A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
DOI: 10.1007/s10957-020-01636-7 · zbMath: 1441.90159 · OpenAlex: W3006347222 · MaRDI QID: Q1985287
Hai-Bin Zhang, Yemane Hailu Fissuh, Tsegay Giday Woldu, Xin Zhang
Publication date: 7 April 2020
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-020-01636-7
Keywords: global convergence, conjugate gradient method, Moreau-Yosida regularization, nonsmooth large-scale problems
MSC: Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Nonlinear programming (90C30) · Methods of reduced gradient type (90C52)
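The record's keywords center on the Moreau-Yosida regularization, which replaces a nonsmooth convex function with a smooth envelope whose gradient is computed from the proximal operator. A minimal illustrative sketch (not the paper's algorithm; the function `f(x) = |x|` and all names below are chosen for illustration only):

```python
# Hedged sketch: Moreau-Yosida envelope of the nonsmooth convex f(x) = |x|.
# e_lam(x) = min_y f(y) + (x - y)^2 / (2*lam) is smooth, and its gradient
# is (x - prox_lam(x)) / lam, which is Lipschitz with constant 1/lam.
import numpy as np

def prox_abs(x, lam):
    # Proximal operator of |.| (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # Evaluate the envelope at the proximal point.
    p = prox_abs(x, lam)
    return np.abs(p) + (x - p) ** 2 / (2.0 * lam)

def moreau_gradient_abs(x, lam):
    # Smooth gradient of the envelope, usable inside a CG-type iteration.
    return (x - prox_abs(x, lam)) / lam
```

For `f = |.|` the envelope is the Huber function: quadratic near the origin, linear (slope ±1) beyond `lam`, which is what makes gradient-based methods such as nonlinear CG applicable to the regularized problem.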
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A new smoothing modified three-term conjugate gradient method for \(l_1\)-norm minimization problem
- Proximity control in bundle methods for convex nondifferentiable minimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A bundle-Newton method for nonsmooth unconstrained minimization
- Convergence of some algorithms for convex minimization
- The smoothing FR conjugate gradient method for solving a kind of nonsmooth optimization problem with \(l_1\)-norm
- A trust region method for nonsmooth convex optimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- Comparison of formulations and solution methods for image restoration problems
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- A descent algorithm for nonsmooth convex optimization
- Optimization and nonsmooth analysis
- Projected gradient methods for linearly constrained problems
- Monotone Operators and the Proximal Point Algorithm
- Atomic Decomposition by Basis Pursuit
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Survey of Bundle Methods for Nonsmooth Optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- New limited memory bundle method for large-scale nonsmooth optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles