The following pages link to Zohre Aminifard (Q2009058):
Displaying 27 items.
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (Q2009059)
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing (Q2034433)
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing (Q2083385)
- A nonmonotone scaled Fletcher-Reeves conjugate gradient method with application in image reconstruction (Q2091131)
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing (Q2116059)
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model (Q2165892)
- A hybrid quasi-Newton method with application in sparse recovery (Q2167383)
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing (Q2243965)
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (Q2336064)
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (Q2424224)
- Two penalized mixed-integer nonlinear programming approaches to tackle multicollinearity and outliers effects in linear regression models (Q2666733)
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem (Q2688917)
- An adaptive descent extension of the Polak–Ribière–Polyak conjugate gradient method based on the concept of maximum magnification (Q3390751)
- MATRIX ANALYSES ON THE DAI–LIAO CONJUGATE GRADIENT METHOD (Q4966642)
- (Q4986670)
- Improved high-dimensional regression models with matrix approximations applied to the comparative case studies with support vector machines (Q5058399)
- Improving the Dai-Liao parameter choices using a fixed point equation (Q5080095)
- A restart scheme for the Dai–Liao conjugate gradient method by ignoring a direction of maximum magnification by the search direction matrix (Q5109818)
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method (Q6053540)
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing (Q6109887)
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation (Q6142074)
- A NONMONOTONE ADMM-BASED DIAGONAL QUASI-NEWTON UPDATE WITH APPLICATION TO THE COMPRESSIVE SENSING PROBLEM (Q6142230)
- A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions (Q6184788)
- A class of CG algorithms overcoming jamming of the iterative solving process and its application in image restoration (Q6489243)
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem (Q6574066)
- A nonmonotone adaptive trust region technique with a forgetting factor (Q6581413)
- Accelerated nonmonotone line search technique for multiobjective optimization (Q6593954)