Scaled memoryless symmetric rank one method for large-scale optimization
Publication: 720629
DOI: 10.1016/j.amc.2011.05.080
zbMath: 1226.65054
OpenAlex: W2045176326
MaRDI QID: Q720629
Wah June Leong, Malik Abu Hassan
Publication date: 11 October 2011
Published in: Applied Mathematics and Computation
Full work available at URL: http://psasir.upm.edu.my/id/eprint/24641/1/Scaled%20memoryless%20symmetric%20rank%20one%20method%20for%20large.pdf
Keywords: unconstrained optimization · numerical results · large-scale optimization · optimal scaling · memoryless quasi-Newton method · symmetric rank one update
Numerical mathematical programming methods (65K05) Large-scale problems in mathematical programming (90C06) Nonconvex programming, global optimization (90C26)
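For orientation, the keywords above refer to a memoryless quasi-Newton scheme in which the Hessian approximation is reset to a scaled identity \(\theta_k I\) at every iteration before the symmetric rank-one (SR1) correction is applied. The display below is a generic sketch of that standard update, with \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\) the usual quasi-Newton quantities; it illustrates the general technique and is not necessarily the exact formulation or choice of scaling parameter analysed in the paper.

% Generic memoryless SR1 update with scaling parameter \theta_k
% (illustrative sketch only; the paper's precise \theta_k may differ).
\[
  B_{k+1} \;=\; \theta_k I \;+\;
  \frac{(y_k - \theta_k s_k)(y_k - \theta_k s_k)^{\mathsf T}}
       {(y_k - \theta_k s_k)^{\mathsf T} s_k},
  \qquad
  d_{k+1} \;=\; -\,B_{k+1}^{-1}\,\nabla f(x_{k+1}).
\]

Because the correction has rank one, \(B_{k+1}^{-1}\) can be applied via the Sherman-Morrison formula without storing an \(n \times n\) matrix, which is what makes memoryless updates attractive for large-scale problems.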
Related Items
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
Uses Software
Cites Work
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- A restarting approach for the symmetric rank one update for unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- A new gradient method via least change secant update
- Sizing and Least-Change Secant Methods
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Optimal conditioning of self-scaling variable Metric algorithms
- Matrix conditioning and nonlinear optimization
- Conjugate Gradient Methods with Inexact Searches
- CUTE: Constrained and unconstrained testing environment
- Line search algorithms with guaranteed sufficient decrease
- A Theoretical and Experimental Study of the Symmetric Rank-One Update
- Measures for Symmetric Rank-One Updates
- Convergence Conditions for Ascent Methods