A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
DOI: 10.1007/s10957-015-0781-1
zbMath: 1332.65081
OpenAlex: W955312132
MaRDI QID: Q255074
Yong Li, Ze-Hong Meng, Gong Lin Yuan
Publication date: 9 March 2016
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-015-0781-1
Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
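The title refers to the Hestenes-Stiefel nonlinear conjugate gradient method. For orientation only, the classical, unmodified HS iteration (not the modification proposed in the article) chooses search directions by

\[
d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k, \qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad y_k = g_{k+1} - g_k,
\]

where \(g_k\) denotes the gradient of the objective at the \(k\)-th iterate; the article adapts this update to large-scale nonsmooth minimization and to systems of nonlinear equations.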
Related Items (70)
- A globally convergent projection method for a system of nonlinear monotone equations
- A non-monotone pattern search approach for systems of nonlinear equations
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property
- An adaptive nonmonotone global Barzilai–Borwein gradient method for unconstrained optimization
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A trust region spectral method for large-scale systems of nonlinear equations
- A new adaptive trust region algorithm for optimization problems
- A new trust region method for nonsmooth nonconvex optimization
- A new filled function for global minimization and system of nonlinear equations
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- On the convergence of \(s\)-dependent GFR conjugate gradient method for unconstrained optimization
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
- An inertial spectral CG projection method based on the memoryless BFGS update
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation
- A modified four-term extension of the Dai-Liao conjugate gradient method
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- The projection technique for two open problems of unconstrained optimization problems
- A family of gradient methods using Householder transformation with application to hypergraph partitioning
- A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- A projection algorithm for pseudomonotone vector fields with convex constraints on Hadamard manifolds
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration
- A modified conjugate gradient method for general convex functions
- A class of three-term derivative-free methods for large-scale nonlinear monotone system of equations and applications to image restoration problems
- A derivative-free Liu-Storey method for solving large-scale nonlinear systems of equations
- An improved three-term derivative-free method for solving nonlinear equations
- An adaptive trust region algorithm for large-residual nonsmooth least squares problems
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- A new proximal Chebychev center cutting plane algorithm for nonsmooth optimization and its convergence
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations
- A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization
- Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
- A quasi-Newton algorithm for large-scale nonlinear equations
- Two nonparametric approaches to mean absolute deviation portfolio selection model
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A new nonmonotone line-search trust-region approach for nonlinear systems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A Cauchy point direction trust region algorithm for nonlinear equations
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A modified conjugate gradient method for monotone nonlinear equations with convex constraints
- A new family of conjugate gradient methods for unconstrained optimization
- A norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations
- An effective adaptive trust region algorithm for nonsmooth minimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A modified spectral PRP conjugate gradient projection method for solving large-scale monotone equations and its application in compressed sensing
- A conjugate gradient algorithm and its applications in image restoration
- A tensor trust-region model for nonlinear system
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A limited memory BFGS subspace algorithm for bound constrained nonsmooth problems
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- A family of inertial-relaxed DFPM-based algorithms for solving large-scale monotone nonlinear equations with application to sparse signal restoration
- A nonmonotone scaled Fletcher-Reeves conjugate gradient method with application in image reconstruction
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
- A generalized geometric spectral conjugate gradient algorithm for finding zero of a monotone tangent vector field on a constant curvature Hadamard manifold
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- Optimal control of viscous Burgers equation via an adaptive nonmonotone Barzilai–Borwein gradient method
Uses Software
Cites Work
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A modified Fletcher-Reeves-type derivative-free method for symmetric nonlinear equations
- Practical quasi-Newton algorithms for singular nonlinear systems
- A BFGS trust-region method for nonlinear equations
- Limited memory BFGS method with backtracking for symmetric nonlinear equations
- Levenberg-Marquardt methods with strong local convergence properties for solving nonlinear equations with convex constraints
- A conjugate gradient method with descent direction for unconstrained optimization
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- A truncated nonmonotone Gauss-Newton method for large-scale nonlinear least-squares problems
- The convergence properties of some new conjugate gradient methods
- Globally convergent limited memory bundle method for large-scale nonsmooth optimization
- Proximity control in bundle methods for convex nondifferentiable minimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- A PRP type method for systems of monotone equations
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A new backtracking inexact BFGS method for symmetric nonlinear equations
- Efficient hybrid conjugate gradient techniques
- Convergence analysis of some methods for minimizing a nonsmooth convex function
- A general approach to convergence properties of some methods for nonsmooth convex optimization
- Tensor methods for large sparse systems of nonlinear equations
- A bundle-Newton method for nonsmooth unconstrained minimization
- Global convergence result for conjugate gradient methods
- Convergence of some algorithms for convex minimization
- A new method for nonsmooth convex optimization
- A new trust region method for nonlinear equations
- An implementation of Shor's \(r\)-algorithm
- On the convergence of a trust-region method for solving constrained nonlinear equations with degenerate solutions
- A trust region method for nonsmooth convex optimization
- A family of variable metric proximal methods
- Proximal level bundle methods for convex nondifferentiable optimization, saddle-point problems and variational inequalities
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Methods of descent for nondifferentiable optimization
- Spectral gradient projection method for monotone nonlinear equations with convex constraints
- A nonsmooth version of Newton's method
- Nonmonotone derivative-free methods for nonlinear equations
- A projection method for a system of nonlinear monotone equations with convex constraints
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Spectral gradient projection method for solving nonlinear monotone equations
- Comparison of formulations and solution methods for image restoration problems
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- Comparing different nonsmooth minimization methods and software
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- A class of derivative-free methods for large-scale nonlinear monotone equations
- A descent algorithm for nonsmooth convex optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- A quasisecant method for minimizing nonsmooth functions
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Projected gradient methods for linearly constrained problems
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Version of the Bundle Idea for Minimizing a Nonsmooth Function: Conceptual Idea, Convergence Analysis, Numerical Results
- Monotone Operators and the Proximal Point Algorithm
- Atomic Decomposition by Basis Pursuit
- Trust Region Methods
- Nonmonotone Spectral Methods for Large-Scale Nonlinear Systems
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- A Globally and Superlinearly Convergent Gauss-Newton-Based BFGS Method for Symmetric Nonlinear Equations
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Numerical Solution of Large Sets of Algebraic Nonlinear Equations
- Numerical Schubert Calculus by the Pieri Homotopy Algorithm
- A BFGS algorithm for solving symmetric nonlinear equations
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- New limited memory bundle method for large-scale nonsmooth optimization
- Spectral residual method without gradient information for solving large-scale nonlinear systems of equations
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles