A new accelerated conjugate gradient method for large-scale unconstrained optimization
Publication:2068094
DOI: 10.1186/s13660-019-2238-9 · zbMath: 1499.90216 · OpenAlex: W2990264654 · Wikidata: Q126775599 · Scholia: Q126775599 · MaRDI QID: Q2068094
Mingyuan Cao, Yue-Ting Yang, Yu-Ting Chen
Publication date: 19 January 2022
Published in: Journal of Inequalities and Applications
Full work available at URL: https://doi.org/10.1186/s13660-019-2238-9
Keywords: global convergence; conjugate gradient; large-scale unconstrained optimization; descent condition; Dai-Liao conjugacy condition
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Methods of reduced gradient type (90C52)
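For context on the keywords above, the following is a minimal sketch of the standard conjugate gradient notation they refer to; the symbols (iterate x_k, gradient g_k, step length alpha_k, direction d_k, parameters t and c) follow common usage in this literature and are not taken from the paper itself, whose specific parameter choices are not reproduced here.

\[
d_0 = -g_0, \qquad x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k .
\]
Dai-Liao conjugacy condition: \( d_{k+1}^{\top} y_k = -t\, s_k^{\top} g_{k+1} \) with \( t > 0 \); sufficient descent condition: \( g_k^{\top} d_k \le -c\, \|g_k\|^{2} \) for some constant \( c > 0 \).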
Related Items
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
Uses Software
Cites Work
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A short note on the global convergence of the unmodified PRP method
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- An improved nonlinear conjugate gradient method with an optimal property
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Comments on a new class of nonlinear conjugate gradient coefficients with global convergence properties
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Two new conjugate gradient methods based on modified secant equations
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- The number and stability of limit cycles for planar piecewise linear systems of node-saddle type
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search
- On the convergence of a modified regularized Newton method for convex optimization with singular solutions
- An efficient two grid method for miscible displacement problem approximated by mixed finite element methods
- Robust parameter identification using parallel global optimization for a batch nonlinear enzyme-catalytic time-delayed process presenting metabolic discontinuities
- Discontinuity-induced limit cycles in a general planar piecewise linear system of saddle-focus type
- Comments on ``Another hybrid conjugate gradient algorithm for unconstrained optimization'' by Andrei
- Robust multi-objective optimal switching control arising in 1,3-propanediol microbial fed-batch process
- On the basins of attraction for a class of delay differential equations with non-monotone bistable nonlinearities
- Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Two optimal Dai–Liao conjugate gradient methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- Comments on “A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter”
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- An Inexact PRP Conjugate Gradient Method for Symmetric Nonlinear Equations
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.
- An efficient hybrid conjugate gradient method for unconstrained optimization