A modified BFGS algorithm based on a hybrid secant equation
From MaRDI portal
Recommendations
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Using a modified secant equation for unconstrained optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A class of modified BFGS algorithm with superlinear convergence based on new quasi-Newton equation
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
Cites work
- scientific article (zbMATH DE number 3529352)
- scientific article (zbMATH DE number 766480)
- A Family of Variable-Metric Methods Derived by Variational Means
- A Modified BFGS Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule
- A modified quasi-Newton method for structured optimization with partial information on the Hessian
- A new approach to variable metric algorithms
- A new structured quasi-Newton algorithm using partial information on Hessian
- Benchmarking optimization software with performance profiles.
- CUTEr and SifDec
- Conditioning of Quasi-Newton Methods for Function Minimization
- Conic Approximations and Collinear Scalings for Optimizers
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Multi-step quasi-Newton methods for optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Optimization theory and methods. Nonlinear programming
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- The Convergence of a Class of Double-rank Minimization Algorithms
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- Using function-values in multi-step quasi-Newton methods
Cited in (10)
- A hybrid BB-type method for solving large scale unconstrained optimization
- A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach
- A modified scaling parameter for the memoryless BFGS updating formula
- Two modified scaled nonlinear conjugate gradient methods
- A modified Perry conjugate gradient method and its global convergence
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Using a modified secant equation for unconstrained optimization
MaRDI item: Q763667