Pages that link to "Item:Q5619730"
From MaRDI portal
The following pages link to Convergence Conditions for Ascent Methods. II: Some Corrections (Q5619730):
Displaying 50 items.
- On three-term conjugate gradient algorithms for unconstrained optimization (Q120733)
- A new three-term conjugate gradient algorithm for unconstrained optimization (Q120735)
- Variable metric random pursuit (Q263217)
- An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property (Q267707)
- Some modified conjugate gradient methods for unconstrained optimization (Q277201)
- Spectral method and its application to the conjugate gradient method (Q279183)
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)
- Convergence and stability of line search methods for unconstrained optimization (Q385584)
- Two modified scaled nonlinear conjugate gradient methods (Q390466)
- A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties (Q415335)
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei (Q453599)
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization (Q457047)
- Modified nonmonotone Armijo line search for descent method (Q535246)
- A sufficient descent LS conjugate gradient method for unconstrained optimization problems (Q654620)
- An adaptive scaled BFGS method for unconstrained optimization (Q684183)
- Analysis of a self-scaling quasi-Newton method (Q689143)
- Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization (Q737229)
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization (Q747220)
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation (Q779629)
- A modified bat algorithm with conjugate gradient method for global optimization (Q779965)
- Convergence of descent method with new line search (Q815995)
- Convergence of quasi-Newton method with new inexact line search (Q819030)
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems (Q849738)
- New inexact line search method for unconstrained optimization (Q850832)
- A constrained optimization approach to solving certain systems of convex equations (Q853007)
- Global convergence properties of the two new dependent Fletcher-Reeves conjugate gradient methods (Q856066)
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization (Q875393)
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization (Q878996)
- New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization (Q989146)
- Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping (Q1016416)
- A truncated descent HS conjugate gradient method and its global convergence (Q1036485)
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients (Q1176574)
- Convergence proof of minimization algorithms for nonconvex functions (Q1232247)
- Convergence properties of the Beale-Powell restart algorithm (Q1286637)
- Modifications of the Wolfe line search rules to satisfy second-order optimality conditions in unconstrained optimization (Q1384067)
- A robust descent type algorithm for geophysical inversion through adaptive regularization (Q1614201)
- An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search (Q1664598)
- A new conjugate gradient algorithm with sufficient descent property for unconstrained optimization (Q1665439)
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (Q1670017)
- A double parameter scaled BFGS method for unconstrained optimization (Q1677470)
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (Q1677473)
- Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery (Q1701931)
- The hybrid BFGS-CG method in solving unconstrained optimization problems (Q1724270)
- Cubic regularization in symmetric rank-1 quasi-Newton methods (Q1741108)
- On obtaining optimal well rates and placement for CO\(_2\) storage (Q1785164)
- The revised DFP algorithm without exact line search (Q1811670)
- On convergence of minimization methods: Attraction, repulsion, and selection (Q1841569)
- From linear to nonlinear iterative methods (Q1873166)
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods (Q1937015)
- On the sufficient descent property of the Shanno's conjugate gradient method (Q1947631)