Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained optimization
From MaRDI portal
Publication:1760876
DOI: 10.1155/2012/875494
zbMath: 1254.90226
OpenAlex: W2104669445
Wikidata: Q58907531
Scholia: Q58907531
MaRDI QID: Q1760876
Lihong Zheng, Wah June Leong, Mahboubeh Farid
Publication date: 15 November 2012
Published in: Journal of Applied Mathematics
Full work available at URL: https://doi.org/10.1155/2012/875494
Related Items (2)
- Scaled diagonal gradient-type method with extra update for large-scale unconstrained optimization
- A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization
Cites Work
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- A new two-step gradient-type method for large-scale unconstrained optimization
- An improved multi-step gradient-type method for large scale optimization
- A monotone gradient method via weak secant equation for unconstrained optimization
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Minimization of functions having Lipschitz continuous first partial derivatives
- A New Newton's Method with Diagonal Jacobian Approximation for Systems of Nonlinear Equations
- Sizing and Least-Change Secant Methods
- Two-Point Step Size Gradient Methods
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Benchmarking optimization software with performance profiles.
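Several of the cited works (e.g. the weak secant and quasi-Cauchy papers) concern diagonal Hessian approximations updated so that the weak secant relation s_kᵀB_{k+1}s_k = s_kᵀy_k holds. As an illustrative sketch only, and not the accumulative multistep scheme of this publication, a minimal diagonal gradient-type method using the standard variational (least-change Frobenius-norm) diagonal update could look like this; all names and safeguards here are assumptions for the example:

```python
import numpy as np

def diag_quasi_cauchy_minimize(f, grad, x0, max_iter=500, tol=1e-6):
    """Gradient-type method with a diagonal Hessian approximation B_k
    (stored as a vector) updated via the weak secant / quasi-Cauchy
    relation  s_k^T B_{k+1} s_k = s_k^T y_k.  Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    B = np.ones_like(x)                 # diagonal of B_0 = I
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g / B                      # search direction from the diagonal model
        # Armijo backtracking line search (a common safeguard; not from the paper)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # least-change diagonal update: min ||B+ - B||_F  s.t.  s^T B+ s = s^T y
        denom = np.sum(s**4)
        if denom > 1e-12:
            B = B + (s.dot(y) - np.sum(B * s**2)) / denom * s**2
        B = np.clip(B, 1e-4, 1e4)       # keep the diagonal model positive definite
        x, g = x_new, g_new
    return x, g
```

On a convex quadratic f(x) = ½ xᵀAx with diagonal A, the update drives sᵀBs toward sᵀAs along each step, so the iteration behaves increasingly like a scaled Newton method.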