Pages that link to "Item:Q972741"
From MaRDI portal
The following pages link to Two new conjugate gradient methods based on modified secant equations (Q972741):
Displayed 43 items.
- A modified scaling parameter for the memoryless BFGS updating formula (Q297561)
- Two modified scaled nonlinear conjugate gradient methods (Q390466)
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence (Q408488)
- An improved nonlinear conjugate gradient method with an optimal property (Q476750)
- Two modified three-term conjugate gradient methods with sufficient descent property (Q479259)
- A limited memory descent Perry conjugate gradient method (Q518141)
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique (Q723782)
- A modified BFGS algorithm based on a hybrid secant equation (Q763667)
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (Q887119)
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems (Q897051)
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation (Q905328)
- A class of one parameter conjugate gradient methods (Q1664259)
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q1937015)
- Two optimal Hager-Zhang conjugate gradient methods for solving monotone nonlinear equations (Q1986155)
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q2017614)
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations (Q2061399)
- A new accelerated conjugate gradient method for large-scale unconstrained optimization (Q2068094)
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations (Q2107668)
- A modified Dai-Kou-type method with applications to signal reconstruction and blurred image restoration (Q2167359)
- A new smoothing spectral conjugate gradient method for solving tensor complementarity problems (Q2171089)
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (Q2189341)
- A modified scaled memoryless symmetric rank-one method (Q2193423)
- Two-step conjugate gradient method for unconstrained optimization (Q2204167)
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations (Q2205641)
- A family of Hager-Zhang conjugate gradient methods for system of monotone nonlinear equations (Q2279636)
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (Q2322819)
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (Q2336064)
- A modified Perry conjugate gradient method and its global convergence (Q2355321)
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization (Q2441364)
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices (Q2514763)
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update (Q2628174)
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter (Q2868907)
- Some modified Yabe–Takano conjugate gradient methods with sufficient descent condition (Q2969958)
- Modified Hager–Zhang conjugate gradient methods via singular value analysis for solving monotone nonlinear equations with convex constraint (Q3383893)
- A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family (Q3458820)
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization (Q4959904)
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations (Q5063454)
- A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization (Q5106116)
- Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination (Q5277965)
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function (Q5379462)
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation (Q5495572)
- A descent family of Dai–Liao conjugate gradient methods (Q5746716)
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters (Q5855678)