Pages that link to "Item:Q1306664"
From MaRDI portal
The following pages link to New quasi-Newton equation and related methods for unconstrained optimization (Q1306664):
Displaying 50 items.
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem (Q283997)
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems (Q295480)
- A modified scaling parameter for the memoryless BFGS updating formula (Q297561)
- Scaling damped limited-memory updates for unconstrained optimization (Q306311)
- Scaling on the spectral gradient method (Q368739)
- Global convergence of a modified limited memory BFGS method for non-convex minimization (Q385195)
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence (Q408488)
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization (Q438775)
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization (Q442712)
- A combined class of self-scaling and modified quasi-Newton methods (Q453615)
- An improved nonlinear conjugate gradient method with an optimal property (Q476750)
- An active set limited memory BFGS algorithm for bound constrained optimization (Q638888)
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates (Q645035)
- A modified BFGS algorithm based on a hybrid secant equation (Q763667)
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation (Q779629)
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (Q849150)
- Global convergence of a memory gradient method for unconstrained optimization (Q861515)
- A compact limited memory method for large scale unconstrained optimization (Q869145)
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (Q887119)
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems (Q897051)
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation (Q905328)
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization (Q953210)
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems (Q964960)
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization (Q970585)
- Two new conjugate gradient methods based on modified secant equations (Q972741)
- Notes on the Dai-Yuan-Yuan modified spectral gradient method (Q984907)
- A limited memory BFGS-type method for large-scale unconstrained optimization (Q1004767)
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization (Q1036299)
- A truncated descent HS conjugate gradient method and its global convergence (Q1036485)
- Variable metric methods for unconstrained optimization and nonlinear least squares (Q1593813)
- A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method (Q1634798)
- A new adaptive Barzilai and Borwein method for unconstrained optimization (Q1653281)
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (Q1670017)
- A modified conjugacy condition and related nonlinear conjugate gradient method (Q1718989)
- Quasi-Newton methods for multiobjective optimization problems (Q1728407)
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization (Q1751056)
- A new modified BFGS method for unconstrained optimization problems (Q1993498)
- Convergence analysis of an improved BFGS method and its application in the Muskingum model (Q2007100)
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems (Q2028038)
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing (Q2034433)
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations (Q2061399)
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations (Q2063152)
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing (Q2083385)
- Solving nonlinear monotone operator equations via modified SR1 update (Q2143842)
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model (Q2165892)
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration (Q2167359)
- A hybrid quasi-Newton method with application in sparse recovery (Q2167383)
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (Q2189341)
- A class of accelerated conjugate-gradient-like methods based on a modified secant equation (Q2190281)
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions (Q2190800)