Pages that link to "Item:Q5953932"
From MaRDI portal
The following pages link to Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations (Q5953932):
Displaying 50 items.
- On Hager and Zhang's conjugate gradient method with guaranteed descent (Q273329)
- A modified scaling parameter for the memoryless BFGS updating formula (Q297561)
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence (Q408488)
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization (Q438775)
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization (Q442712)
- A combined class of self-scaling and modified quasi-Newton methods (Q453615)
- An improved nonlinear conjugate gradient method with an optimal property (Q476750)
- Spectral scaling BFGS method (Q604256)
- New quasi-Newton methods via higher order tensor models (Q629502)
- An adaptive scaled BFGS method for unconstrained optimization (Q684183)
- A conjugate gradient method with sufficient descent property (Q747726)
- A modified BFGS algorithm based on a hybrid secant equation (Q763667)
- Sufficient descent nonlinear conjugate gradient methods with conjugacy condition (Q849150)
- Global convergence of a memory gradient method for unconstrained optimization (Q861515)
- A compact limited memory method for large scale unconstrained optimization (Q869145)
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (Q887119)
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems (Q897051)
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation (Q905328)
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization (Q953210)
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems (Q964960)
- A new structured quasi-Newton algorithm using partial information on Hessian (Q966094)
- Two new conjugate gradient methods based on modified secant equations (Q972741)
- A truncated descent HS conjugate gradient method and its global convergence (Q1036485)
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (Q1670017)
- A double parameter scaled BFGS method for unconstrained optimization (Q1677470)
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (Q1677473)
- A new conjugate gradient method based on quasi-Newton equation for unconstrained optimization (Q1713190)
- Quasi-Newton methods for multiobjective optimization problems (Q1728407)
- A new modified BFGS method for unconstrained optimization problems (Q1993498)
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations (Q2013630)
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems (Q2028038)
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations (Q2061399)
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration (Q2167359)
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (Q2189341)
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions (Q2190800)
- A modified Newton-like method for nonlinear equations (Q2204162)
- Two-step conjugate gradient method for unconstrained optimization (Q2204167)
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization (Q2204182)
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations (Q2205641)
- Nonmonotone adaptive trust region method with line search based on new diagonal updating (Q2261939)
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations (Q2279636)
- Secant update version of quasi-Newton PSB with weighted multisecant equations (Q2301143)
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (Q2322819)
- Limited memory BFGS method based on a high-order tensor model (Q2340525)
- A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization (Q2346397)
- An adaptive sizing BFGS method for unconstrained optimization (Q2355301)
- A modified Perry conjugate gradient method and its global convergence (Q2355321)
- A new simple model trust-region method with generalized Barzilai-Borwein parameter for large-scale optimization (Q2360814)
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions (Q2372957)
- A modified quasi-Newton method for nonlinear equations (Q2406290)