Some new step-size rules for optimization problems
From MaRDI portal
Publication:5436239
DOI: 10.1007/S11741-007-0209-8
zbMATH Open: 1142.65366
OpenAlex: W2036281985
MaRDI QID: Q5436239
FDO: Q5436239
Authors: Qing-Jun Wu, Zengxin Wei
Publication date: 14 January 2008
Published in: Journal of Shanghai University (English Edition)
Full work available at URL: https://doi.org/10.1007/s11741-007-0209-8
Recommendations
- On step-size estimation of line search methods
- A step size rule for unconstrained optimization
- Global convergence results for a new three-term conjugate gradient method with Armijo step size rule
- scientific article; zbMATH DE number 962459
- A simple adaptive step-size choice for iterative optimization methods
Keywords: convergence; global convergence; unconstrained minimization; line search methods; Armijo step-size rule; Armijo-Goldstein step-size rule; Wolfe-Powell step-size rule
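For orientation, the classical Armijo step-size rule named in the keywords can be sketched as a backtracking line search: starting from a trial step, the step length is shrunk until a sufficient-decrease condition holds. This is a generic one-dimensional illustration under standard textbook conventions, not the new rules proposed in the publication itself; the function names and default constants (c = 1e-4, rho = 0.5) are illustrative choices.

```python
def armijo_step(f, dfx, x, d, alpha0=1.0, c=1e-4, rho=0.5, max_iter=50):
    # Backtracking line search: shrink alpha until the Armijo
    # sufficient-decrease condition f(x + a*d) <= f(x) + c*a*f'(x)*d holds.
    fx = f(x)
    slope = dfx * d  # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            break
        alpha *= rho
    return alpha

# Example: f(x) = x**2 at x = 3, along the steepest-descent direction d = -f'(x).
f = lambda x: x * x
x0 = 3.0
step = armijo_step(f, 2 * x0, x0, -2 * x0)
print(step)  # 0.5: the full step overshoots, one halving satisfies the condition
```

The Armijo-Goldstein and Wolfe-Powell rules mentioned alongside it add further conditions (a lower bound on the decrease, or a curvature condition on the directional derivative at the new point) to rule out steps that are too small.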
Cites Work
- A new method for nonsmooth convex optimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Convergence analysis of a proximal Newton method
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Minimization of functions having Lipschitz continuous first partial derivatives
- Asymptotic Convergence Analysis of Some Inexact Proximal Point Algorithms for Minimization
- New Proximal Point Algorithms for Convex Minimization
- A modification of Armijo's step-size rule for negative curvature
Cited In (3)
This page was built for publication: Some new step-size rules for optimization problems