Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
From MaRDI portal
Publication: 2009059
DOI: 10.1007/s11075-019-00658-1 · zbMath: 1433.90157 · OpenAlex: W2910211331 · MaRDI QID: Q2009059
Zohre Aminifard, Saman Babaie-Kafaki
Publication date: 27 November 2019
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-019-00658-1
Numerical mathematical programming methods (65K05) · Nonlinear programming (90C30) · Methods of quasi-Newton type (90C53)
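The record carries no abstract, but the title names two ingredients that can be illustrated generically: a memoryless BFGS search direction (the BFGS update applied to the identity matrix, so no Hessian approximation is stored) and a nonmonotone acceptance rule for the step length. The sketch below is a minimal generic illustration assuming a GLL-type (Grippo-Lampariello-Lucidi) nonmonotone Armijo condition; it is not the two-parameter scaled method of the paper, and all function and parameter names are illustrative.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Direction d = -H g, where H is the BFGS update of the identity
    built from only the most recent pair (s, y) (memoryless BFGS)."""
    sy = s @ y
    if sy <= 1e-12:  # curvature safeguard: fall back to steepest descent
        return -g
    Hg = (g
          - (s * (y @ g) + y * (s @ g)) / sy
          + (1.0 + (y @ y) / sy) * (s @ g) / sy * s)
    return -Hg

def nonmonotone_armijo(f, x, d, g, f_hist, t0=1.0, beta=0.5, c=1e-4):
    """GLL-type nonmonotone backtracking: accept a step once
    f(x + t d) <= max(recent f values) + c * t * g.d."""
    fmax = max(f_hist)
    gd = g @ d
    t = t0
    for _ in range(60):  # cap backtracks to avoid step underflow
        if f(x + t * d) <= fmax + c * t * gd:
            break
        t *= beta
    return t

def minimize(f, grad, x0, tol=1e-6, max_iter=500, memory=10):
    """Memoryless BFGS iteration with a nonmonotone Armijo step."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    f_hist = [f(x)]
    s = y = None
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g if s is None else memoryless_bfgs_direction(g, s, y)
        t = nonmonotone_armijo(f, x, d, g, f_hist[-memory:])
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Usage on a simple convex quadratic with its minimum at the origin.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = minimize(f, grad, [3.0, -2.0])
```

Because the memoryless BFGS matrix is positive definite whenever the curvature condition s.y > 0 holds, the direction is guaranteed to be a descent direction, so the backtracking loop terminates; the nonmonotone rule merely relaxes the acceptance threshold from the current function value to the maximum over a short history.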
Related Items (7)
- A unified derivative-free projection method model for large-scale nonlinear equations with convex constraints
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- A restart scheme for the memoryless BFGS method
- Unnamed Item
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- A modified scaling parameter for the memoryless BFGS updating formula
- Two modified scaled nonlinear conjugate gradient methods
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A modified scaled conjugate gradient method with global convergence for nonconvex functions
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- An adaptive scaled BFGS method for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Modifying the BFGS method
- A double parameter scaled BFGS method for unconstrained optimization
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Optimal conditioning of self-scaling variable Metric algorithms
- A Nonmonotone Line Search Technique for Newton’s Method
- A Note on Performance Profiles for Benchmarking Software
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- CUTEr and SifDec
- A modified BFGS method and its global convergence in nonconvex minimization
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles.
This page was built for publication: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length