Scaled memoryless symmetric rank one method for large-scale optimization
DOI: 10.1016/j.amc.2011.05.080 · zbMATH Open: 1226.65054 · OpenAlex: W2045176326 · MaRDI QID: Q720629
Authors: W. J. Leong, M. A. Hassan
Publication date: 11 October 2011
Published in: Applied Mathematics and Computation
Full work available at URL: http://psasir.upm.edu.my/id/eprint/24641/1/Scaled%20memoryless%20symmetric%20rank%20one%20method%20for%20large.pdf
Recommendations
- Positive-definite memoryless symmetric rank one method for large-scale unconstrained optimization
- Scientific article (title not available; zbMATH DE number 5630374)
- Updating the self-scaling symmetric rank one algorithm with limited memory for large-scale unconstrained optimization
- A modified scaled memoryless symmetric rank-one method
- A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization
- Structured symmetric rank-one method for unconstrained optimization
- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
- Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization
- An approach to scaling symmetric rank-one update
- Non-Euclidean restricted memory level method for large-scale convex optimization
Keywords: numerical results; optimal scaling; unconstrained optimization; large-scale optimization; memoryless quasi-Newton method; symmetric rank one update
MSC classifications: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26)
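The keywords above point at the core computation: a memoryless quasi-Newton search direction built from a self-scaled symmetric rank one (SR1) update, using only O(n) vector operations. The sketch below is a generic illustration under stated assumptions, not necessarily the exact method of the paper: the inverse-Hessian approximation is reset each iteration to gamma * I, the scaling choice gamma = s's / s'y and the denominator skip rule are assumptions, and the function name is hypothetical.

```python
import numpy as np

def scaled_memoryless_sr1_direction(g, s, y, eps=1e-8):
    """Search direction d = -H g from a scaled memoryless SR1 update.

    Generic sketch (assumptions, not the paper's exact algorithm):
    H0 = gamma * I with gamma = s's / s'y (assumes the curvature
    condition s'y > 0 from the line search), followed by one SR1
    inverse update, all without forming an n-by-n matrix.
    """
    gamma = (s @ s) / (s @ y)       # assumed self-scaling parameter
    u = s - gamma * y               # s - H0 y
    uy = u @ y                      # SR1 denominator
    if abs(uy) < eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -gamma * g           # skip the update when the denominator is tiny
    # d = -(gamma*I + u u' / u'y) g, computed with vector operations only
    return -(gamma * g + (u @ g) / uy * u)
```

By construction the updated matrix satisfies the secant equation H y = s, so evaluating the direction at g = y returns exactly -s; this is a quick sanity check on any memoryless SR1 implementation.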
Cites Work
- Testing Unconstrained Optimization Software
- CUTE
- On the limited memory BFGS method for large scale optimization
- Line search algorithms with guaranteed sufficient decrease
- Convergence Conditions for Ascent Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- Sizing and Least-Change Secant Methods
- Matrix conditioning and nonlinear optimization
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Optimal conditioning of self-scaling variable Metric algorithms
- Conjugate Gradient Methods with Inexact Searches
- A Theoretical and Experimental Study of the Symmetric Rank-One Update
- A restarting approach for the symmetric rank one update for unconstrained optimization
- Measures for Symmetric Rank-One Updates
- A new gradient method via least change secant update
Cited In (8)
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- Title not available
- Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- Positive-definite memoryless symmetric rank one method for large-scale unconstrained optimization
- A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization
- A modified scaled memoryless symmetric rank-one method
- Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization