Pages that link to "Item:Q1133148"
From MaRDI portal
The following pages link to Stepsize analysis for descent methods (Q1133148):
Displaying 19 items.
- Accelerating convergence in minisum location problem with \(\ell_p\) norms (Q336624)
- New cautious BFGS algorithm based on modified Armijo-type line search (Q385833)
- Modified nonmonotone Armijo line search for descent method (Q535246)
- Solution of location problems with radial cost functions (Q584914)
- A gradient-related algorithm with inexact line searches (Q596214)
- On the convergence of the descent methods (Q762759)
- New inexact line search method for unconstrained optimization (Q850832)
- Accelerating convergence in the Fermat-Weber location problem (Q1273094)
- A new descent algorithm with curve search rule (Q1764727)
- Convergence of line search methods for unconstrained optimization (Q1881700)
- A survey of gradient methods for solving nonlinear optimization (Q2220680)
- Comparing gradient descent with automatic differentiation and particle swarm optimization techniques for estimating tumor blood flow parameters in contrast-enhanced imaging (Q2291881)
- Memory gradient method with Goldstein line search (Q2469911)
- Convergence of descent method without line search (Q2570691)
- A new super-memory gradient method with curve search rule (Q2571993)
- A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions (Q2671453)
- Computer Algebra and Line Search (Q3086910)
- A new class of memory gradient methods with inexact line searches (Q4675852)
- Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds (Q5070619)