A conjugate gradient type method for the nonnegative constraints optimization problems
From MaRDI portal
Publication: 2375742
DOI: 10.1155/2013/986317 · zbMath: 1266.65101 · OpenAlex: W1966607107 · Wikidata: Q59007367 · Scholia: Q59007367 · MaRDI QID: Q2375742
Publication date: 14 June 2013
Published in: Journal of Applied Mathematics
Full work available at URL: https://doi.org/10.1155/2013/986317
Cites Work
- A modified Fletcher-Reeves-type derivative-free method for symmetric nonlinear equations
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Conjugate gradient method for the linear complementarity problem with S-matrix
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Convergence Properties of Algorithms for Nonlinear Optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the convergence of conjugate gradient algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- An efficient hybrid conjugate gradient method for unconstrained optimization