Some new step-size rules for optimization problems
From MaRDI portal
Publication:5436239
Recommendations
- On step-size estimation of line search methods
- A step size rule for unconstrained optimization
- Global convergence results for a new three-term conjugate gradient method with Armijo step size rule
- scientific article; zbMATH DE number 962459
- A simple adaptive step-size choice for iterative optimization methods
Cites work
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A modification of Armijo's step-size rule for negative curvature
- A new method for nonsmooth convex optimization
- Asymptotic Convergence Analysis of Some Inexact Proximal Point Algorithms for Minimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Convergence analysis of a proximal Newton method
- Minimization of functions having Lipschitz continuous first partial derivatives
- New Proximal Point Algorithms for Convex Minimization
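Several of the cited works concern Armijo-type step-size rules (e.g. "A modification of Armijo's step-size rule for negative curvature" and "Minimization of functions having Lipschitz continuous first partial derivatives"). A minimal sketch of the classical Armijo backtracking rule follows; the function names, parameter values, and test problem are illustrative assumptions, not drawn from the publication itself.

```python
def armijo_step(f, grad_f, x, d, alpha0=1.0, beta=0.5, sigma=1e-4):
    """Backtrack from alpha0 until the Armijo sufficient-decrease condition
    f(x + alpha*d) <= f(x) + sigma*alpha*grad_f(x)^T d holds.
    All defaults (alpha0, beta, sigma) are conventional illustrative choices."""
    fx = f(x)
    # Directional derivative grad_f(x)^T d; must be negative for a descent direction.
    slope = sum(gi * di for gi, di in zip(grad_f(x), d))
    alpha = alpha0
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + sigma * alpha * slope:
        alpha *= beta  # shrink the trial step geometrically
    return alpha

# Usage sketch: minimize f(x) = x^2 from x = 2 along steepest descent.
f = lambda x: x[0] ** 2
grad = lambda x: [2 * x[0]]
x = [2.0]
d = [-g for g in grad(x)]        # d = [-4.0]
alpha = armijo_step(f, grad, x, d)  # alpha0 = 1 is rejected, alpha = 0.5 accepted
```

In this toy run the full step alpha = 1 overshoots (f stays at 4), so one halving yields alpha = 0.5, which lands exactly on the minimizer x = 0.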
Cited in (3)
MaRDI item: Q5436239