Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
Publication: 2189404
DOI: 10.1007/s11075-019-00787-7 · zbMath: 1461.65190 · OpenAlex: W2966950671 · MaRDI QID: Q2189404
Xiaoliang Wang, Zhou Sheng, Gong Lin Yuan
Publication date: 15 June 2020
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00787-7
Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Methods of quasi-Newton type (90C53)
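The classification above places the publication among nonlinear conjugate gradient (CG) methods for nonconvex unconstrained optimization with a weak (Wolfe-type) line search. For orientation only, the following is a minimal sketch of a generic nonlinear CG iteration under the weak Wolfe conditions; the PRP+ update, the bisection-style line search, and the Rosenbrock test problem are illustrative assumptions and do not reproduce the specific family of algorithms or parameters proposed in the paper.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style search for a step size satisfying the weak Wolfe conditions."""
    f0, g0d = f(x), grad(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:      # sufficient-decrease (Armijo) condition fails
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * g0d:      # curvature condition fails
            lo = t
            t = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t                                       # fall back to the last trial step

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic PRP+ nonlinear conjugate gradient iteration (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:                             # safeguard: restart with steepest descent
            d = -g
        t = weak_wolfe_line_search(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ parameter: truncate the raw Polak-Ribiere-Polyak value at zero
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Nonconvex two-dimensional test: the Rosenbrock function
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                               200 * (x[1] - x[0] ** 2)])
    print(prp_plus_cg(f, grad, np.array([-1.2, 1.0])))   # expected to approach (1, 1)
```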
Related Items (16)
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
- A smoothing projected HS method for solving stochastic tensor complementarity problem
- A Dai-Liao-type projection method for monotone nonlinear equations and signal processing
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A family of gradient methods using Householder transformation with application to hypergraph partitioning
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- A class of three-term derivative-free methods for large-scale nonlinear monotone system of equations and applications to image restoration problems
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A new hybrid PRPFR conjugate gradient method for solving nonlinear monotone equations and image restoration problems
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems
- Unnamed Item
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Modified spectral PRP conjugate gradient method for solving tensor eigenvalue complementarity problems
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A globally convergent version of the Polak-Ribière conjugate gradient method
- A new adaptive trust region algorithm for optimization problems
- An adaptive trust region algorithm for large-residual nonsmooth least squares problems
- A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- An effective adaptive trust region algorithm for nonsmooth minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- The global convergence of a modified BFGS method for nonconvex functions
- Convergence Properties of Algorithms for Nonlinear Optimization
- A new trust-region method with line search for solving symmetric nonlinear equations
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Numerical Optimization
- Convergence Properties of the BFGS Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A method for the solution of certain non-linear problems in least squares
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles