Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
DOI: 10.1007/s11075-019-00836-1
zbMath: 1455.65085
OpenAlex: W3017234272
MaRDI QID: Q2205641
Jamilu Sabi'u, Kabiru Ahmed Hungu, Mohammed Yusuf Waziri
Publication date: 21 October 2020
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00836-1
Keywords: global convergence; eigenvalue analysis; nonlinear equations; conjugate gradient methods; monotonicity property; hyperplane projection
MSC classifications: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical computation of solutions to systems of equations (65H10); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
Related Items (16)
Cites Work
- An optimal parameter for Dai-Liao family of conjugate gradient methods
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Globally convergent modified Perry's conjugate gradient method
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- A BFGS trust-region method for nonlinear equations
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- Levenberg–Marquardt methods with strong local convergence properties for solving nonlinear equations with convex constraints
- A norm descent derivative-free algorithm for solving large-scale nonlinear symmetric equations
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- A truncated nonmonotone Gauss-Newton method for large-scale nonlinear least-squares problems
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- A globally convergent derivative-free method for solving large-scale nonlinear monotone equations
- A PRP type method for systems of monotone equations
- Two new conjugate gradient methods based on modified secant equations
- Tensor methods for large sparse systems of nonlinear equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- A new trust region method for nonlinear equations
- Globally convergent three-term conjugate gradient projection methods for solving nonlinear monotone equations
- A PRP-based residual method for large-scale monotone nonlinear equations
- A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations
- On the convergence of a trust-region method for solving constrained nonlinear equations with degenerate solutions
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization
- A projection method for convex constrained monotone nonlinear equations with applications
- A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing
- Convergence properties of an iterative method for solving symmetric non-linear equations
- A derivative-free method for solving large-scale nonlinear systems of equations
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method
- A family of conjugate gradient methods for large-scale nonlinear equations
- A scaled derivative-free projection method for solving nonlinear monotone equations
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Monotonicity of Fixed Point and Normal Mappings Associated with Variational Inequality and Its Application
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Two optimal Dai–Liao conjugate gradient methods
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Technical Note—A Modified Conjugate Gradient Algorithm
- Numerical Optimization
- Newton-type Methods with Generalized Distances For Constrained Optimization
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- A derivative-free three-term projection algorithm involving spectral quotient for solving nonlinear monotone equations
- A Globally and Superlinearly Convergent Gauss–Newton-Based BFGS Method for Symmetric Nonlinear Equations
- A Nonmonotone Line Search Technique for Newton’s Method
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- The conjugate gradient method in extremal problems
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- A method for the solution of certain non-linear problems in least squares
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.