Pages that link to "Item:Q2493696"
From MaRDI portal
The following pages link to New quasi-Newton methods for unconstrained optimization problems (Q2493696):
Displayed 50 items.
- On Hager and Zhang's conjugate gradient method with guaranteed descent (Q273329)
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem (Q283997)
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems (Q295480)
- A modified scaling parameter for the memoryless BFGS updating formula (Q297561)
- Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization (Q357379)
- Scaling on the spectral gradient method (Q368739)
- Global convergence of a modified limited memory BFGS method for non-convex minimization (Q385195)
- Two modified scaled nonlinear conjugate gradient methods (Q390466)
- An improved nonlinear conjugate gradient method with an optimal property (Q476750)
- New quasi-Newton methods via higher order tensor models (Q629502)
- Improved Hessian approximation with modified secant equations for symmetric rank-one method (Q629505)
- An active set limited memory BFGS algorithm for bound constrained optimization (Q638888)
- Convergence analysis of a modified BFGS method on convex minimizations (Q711385)
- A conjugate gradient method with sufficient descent property (Q747726)
- A modified BFGS algorithm based on a hybrid secant equation (Q763667)
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation (Q779629)
- Global convergence properties of two modified BFGS-type methods (Q874351)
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (Q887119)
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems (Q897051)
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation (Q905328)
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions (Q928232)
- Two new conjugate gradient methods based on modified secant equations (Q972741)
- Notes on the Dai-Yuan-Yuan modified spectral gradient method (Q984907)
- A limited memory BFGS-type method for large-scale unconstrained optimization (Q1004767)
- A new backtracking inexact BFGS method for symmetric nonlinear equations (Q1031701)
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization (Q1036299)
- A new adaptive trust region algorithm for optimization problems (Q1637037)
- A new adaptive Barzilai and Borwein method for unconstrained optimization (Q1653281)
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (Q1677473)
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing (Q1724067)
- An effective adaptive trust region algorithm for nonsmooth minimization (Q1790684)
- A new modified BFGS method for unconstrained optimization problems (Q1993498)
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations (Q2013630)
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (Q2189341)
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions (Q2190800)
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs (Q2252688)
- Nonmonotone adaptive trust region method with line search based on new diagonal updating (Q2261939)
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (Q2301132)
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (Q2322819)
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function (Q2327437)
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization (Q2346397)
- An adaptive sizing BFGS method for unconstrained optimization (Q2355301)
- A modified nonmonotone BFGS algorithm for unconstrained optimization (Q2400759)
- A modified quasi-Newton method for nonlinear equations (Q2406290)
- New line search methods for unconstrained optimization (Q2510603)
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices (Q2514763)
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update (Q2628174)
- A modified Hestenes-Stiefel conjugate gradient algorithm for large-scale optimization (Q2854330)
- A class of modified BFGS methods with function value information for unconstrained optimization (Q2873833)
- Some modified Yabe-Takano conjugate gradient methods with sufficient descent condition (Q2969958)