The following pages link to Saman Babaie-Kafaki (Q297560):
Displaying 50 items.
- A modified scaling parameter for the memoryless BFGS updating formula (Q297561)
- A class of biased estimators based on QR decomposition (Q307819)
- Two modified scaled nonlinear conjugate gradient methods (Q390466)
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei (Q453599)
- A modified scaled conjugate gradient method with global convergence for nonconvex functions (Q464217)
- An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method (Q464539)
- Two modified three-term conjugate gradient methods with sufficient descent property (Q479259)
- On the sufficient descent condition of the Hager-Zhang conjugate gradient methods (Q483732)
- On optimality of two adaptive choices for the parameter of Dai-Liao method (Q518148)
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update (Q523565)
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates (Q645035)
- A modified nonmonotone trust region line search method (Q721572)
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique (Q723782)
- A modified BFGS algorithm based on a hybrid secant equation (Q763667)
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (Q887119)
- Two new conjugate gradient methods based on modified secant equations (Q972741)
- Two accelerated nonmonotone adaptive trust region line search methods (Q1652801)
- Two extensions of the Dai-Liao method with sufficient descent property based on a penalization scheme (Q1653949)
- Two adaptive Dai-Liao nonlinear conjugate gradient methods (Q1787816)
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q1937015)
- On the sufficient descent property of the Shanno's conjugate gradient method (Q1947631)
- A note on the global convergence of the quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q1955570)
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (Q2009059)
- A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q2017614)
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing (Q2034433)
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing (Q2083385)
- A nonmonotone scaled Fletcher-Reeves conjugate gradient method with application in image reconstruction (Q2091131)
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing (Q2116059)
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model (Q2165892)
- A hybrid quasi-Newton method with application in sparse recovery (Q2167383)
- A modified scaled memoryless symmetric rank-one method (Q2193423)
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing (Q2243965)
- A heuristic approach to combat multicollinearity in least trimmed squares regression analysis (Q2295249)
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (Q2336064)
- A descent extension of the Polak-Ribière-Polyak conjugate gradient method (Q2400710)
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (Q2424224)
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization (Q2441364)
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices (Q2514763)
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update (Q2628174)
- Two penalized mixed-integer nonlinear programming approaches to tackle multicollinearity and outliers effects in linear regression models (Q2666733)
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem (Q2688917)
- Descent Symmetrization of the Dai–Liao Conjugate Gradient Method (Q2809315)
- A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method (Q2826665)
- A modified two-point stepsize gradient algorithm for unconstrained minimization (Q2867423)
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai–Yuan hybrid conjugate gradient parameter (Q2868907)
- An efficient and practically robust hybrid metaheuristic algorithm for solving fuzzy bus terminal location problems (Q2911573)
- An adaptive conjugacy condition and related nonlinear conjugate gradient methods (Q2971928)
- Two effective hybrid metaheuristic algorithms for minimization of multimodal functions (Q3101632)
- An adaptive descent extension of the Polak–Ribière–Polyak conjugate gradient method based on the concept of maximum magnification (Q3390751)
- A heuristic algorithm to combat outliers and multicollinearity in regression model analysis (Q3390780)