Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
From MaRDI portal
Recommendations
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the \(\ell_\infty\) matrix norm
- Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A modified scaling BFGS method for nonconvex minimization
- A modified scaling parameter for the memoryless BFGS updating formula
- On the limited memory BFGS method for large scale optimization
Cites work
- scientific article; zbMATH DE number 1805736
- scientific article; zbMATH DE number 5060482
- A Nonmonotone Line Search Technique for Newton’s Method
- A double parameter scaled BFGS method for unconstrained optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified scaled conjugate gradient method with global convergence for nonconvex functions
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A modified scaling parameter for the memoryless BFGS updating formula
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A note on performance profiles for benchmarking software
- A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A spectral conjugate gradient method for unconstrained optimization
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Algorithm 851
- An adaptive scaled BFGS method for unconstrained optimization
- Another hybrid conjugate gradient algorithm for unconstrained optimization
- Benchmarking optimization software with performance profiles
- CUTEr and SifDec
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Modifying the BFGS method
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Optimal conditioning of self-scaling variable metric algorithms
- Optimization theory and methods. Nonlinear programming
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Two modified scaled nonlinear conjugate gradient methods
Cited in (13)
- A hybrid quasi-Newton method with application in sparse recovery
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- A double parameter scaled BFGS method for unconstrained optimization
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A nonmonotone adaptive trust region technique with a forgetting factor
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A restart scheme for the memoryless BFGS method
- A unified derivative-free projection method model for large-scale nonlinear equations with convex constraints
- A new version of augmented self-scaling BFGS method
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
This page was built for publication: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
(MaRDI item Q2009059)