Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
From MaRDI portal
Publication: 2189404
DOI: 10.1007/S11075-019-00787-7
zbMATH Open: 1461.65190
OpenAlex: W2966950671
MaRDI QID: Q2189404
FDO: Q2189404
Xiaoliang Wang, Zhou Sheng, Gonglin Yuan
Publication date: 15 June 2020
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00787-7
Recommendations
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems
- Global convergence of a descent PRP type conjugate gradient method for nonconvex optimization
- A descent family of three-term conjugate gradient methods with global convergence for general functions
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Methods of quasi-Newton type (90C53)
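The classifications above place this publication among nonlinear conjugate gradient methods for nonconvex optimization. As illustrative context only (this is a generic textbook scheme, not the paper's weak conjugate gradient family), a minimal sketch of the classic Polak-Ribière-Polyak nonlinear CG iteration with a backtracking Armijo line search and the PRP+ truncation safeguard:

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Polak-Ribiere-Polyak nonlinear conjugate gradient method with a
    backtracking Armijo line search and the PRP+ restart safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search: halve the step until
        # f(x + a*d) <= f(x) + c1 * a * (g . d)
        a, c1 = 1.0, 1e-4
        while f(x + a * d) > f(x) + c1 * a * (g @ d) and a > 1e-12:
            a *= 0.5
        x_new = x + a * d
        g_new = grad(x_new)
        # PRP conjugacy parameter; max(0, .) is the PRP+ truncation
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:  # not a descent direction: restart
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose minimizer solves Ax = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.zeros(2))
```

The restart-to-steepest-descent safeguard matters in the nonconvex setting: unlike the linear CG case, the PRP update can produce a non-descent direction, and restarting is one standard way the global convergence analyses cited below recover the descent property.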
Cites Work
- Algorithm 851
- Numerical Optimization
- Benchmarking optimization software with performance profiles.
- Title not available
- Function minimization by conjugate gradients
- Title not available
- An unconstrained optimization test functions collection
- Title not available
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A method for the solution of certain non-linear problems in least squares
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Efficient generalized conjugate gradient algorithms. I: Theory
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- BFGS trust-region method for symmetric nonlinear equations
- A modified PRP conjugate gradient method
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Title not available
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- A new trust-region method with line search for solving symmetric nonlinear equations
- Convergence Properties of Algorithms for Nonlinear Optimization
- Convergence Properties of the BFGS Algorithm
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Title not available
- \(n\)-step quadratic convergence of the MPRP method with a restart strategy
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of a modified BFGS method for nonconvex functions
- A new adaptive trust region algorithm for optimization problems
- An adaptive trust region algorithm for large-residual nonsmooth least squares problems
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A novel parameter estimation method for the Muskingum model using a new Newton-type trust region algorithm
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- An effective adaptive trust region algorithm for nonsmooth minimization
Cited In (16)
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- Title not available
- A family of gradient methods using Householder transformation with application to hypergraph partitioning
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A new hybrid PRPFR conjugate gradient method for solving nonlinear monotone equations and image restoration problems
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems
- A smoothing projected HS method for solving stochastic tensor complementarity problem
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- A Dai-Liao-type projection method for monotone nonlinear equations and signal processing
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications
- Modified spectral PRP conjugate gradient method for solving tensor eigenvalue complementarity problems
- A class of three-term derivative-free methods for large-scale nonlinear monotone system of equations and applications to image restoration problems
This page was built for publication: Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions