A modified scaled memoryless symmetric rank-one method
DOI: 10.1007/S40574-020-00231-Y · zbMath: 1448.90100 · OpenAlex: W3039497268 · MaRDI QID: Q2193423
Publication date: 25 August 2020
Published in: Bollettino dell'Unione Matematica Italiana
Full work available at URL: https://doi.org/10.1007/s40574-020-00231-y
Keywords: unconstrained optimization; eigenvalue; large-scale optimization; condition number; symmetric rank-one update; memoryless quasi-Newton method
MSC classification: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
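The record itself does not reproduce any formulas from the paper. As a rough orientation to the keywords "symmetric rank-one update" and "memoryless quasi-Newton method", the sketch below shows a generic scaled memoryless SR1 search direction: the inverse Hessian approximation is rebuilt at every iteration from a scaled identity plus a single SR1 correction using the latest step/gradient-difference pair, with the usual denominator safeguard. The scaling choice, function name, and fallback steps are illustrative assumptions and are not the specific modification proposed in this publication.

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-8):
    """Generic scaled memoryless SR1 search direction (illustrative sketch only).

    g : current gradient
    s : previous step,                s = x_k - x_{k-1}
    y : gradient difference,          y = g_k - g_{k-1}

    The inverse Hessian approximation is rebuilt from a scaled identity
    (an Oren-Luenberger-type scaling, assumed here as theta = y'y / s'y)
    plus one SR1 correction based on the latest (s, y) pair.
    """
    sy = s @ y
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return -g                        # curvature too weak: fall back to steepest descent

    theta = (y @ y) / sy                 # scaling of the initial matrix H_0 = (1/theta) I
    u = s - y / theta                    # SR1 correction vector for the inverse approximation
    uy = u @ y
    # Standard SR1 safeguard: skip the correction if the denominator is tiny.
    if abs(uy) < eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -g / theta

    # d = -H g with H = (1/theta) I + u u' / (u' y), formed matrix-free.
    return -(g / theta + u * (u @ g) / uy)
```

Such a direction would then be combined with a line search, as in the memoryless quasi-Newton and conjugate gradient works listed under Cites Work below.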
Related Items (1)
Uses Software
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Two new conjugate gradient methods based on modified secant equations
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms. Part I
- Self-Scaling Variable Metric (SSVM) Algorithms. Part II
- Optimal conditioning of self-scaling variable metric algorithms
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Approximation of Sequences of Symmetric Matrices with the Symmetric Rank-One Algorithm and Applications
- CUTEr and SifDec
- Benchmarking optimization software with performance profiles
- A survey of quasi-Newton equations and quasi-Newton methods for optimization