A class of superlinearly convergent projection algorithms with relaxed stepsizes
Publication: 1057626
DOI: 10.1007/BF01449032
zbMath: 0563.65041
MaRDI QID: Q1057626
Publication date: 1984
Published in: Applied Mathematics and Optimization
Keywords: constrained optimization; superlinear convergence; quasi-Newton; quadratic subproblem; Goldstein-Levitin-Polyak projected gradient algorithm; relaxed stepsizes
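As background for these keywords (and not a reproduction of the paper's algorithm), below is a minimal Python sketch of the classical Goldstein-Levitin-Polyak projected gradient iteration x_{k+1} = P_C(x_k - a_k grad f(x_k)). The box constraint set and the constant stepsize are illustrative assumptions; the paper's contribution concerns relaxed stepsize rules and superlinearly convergent quasi-Newton variants, which this sketch does not implement.

```python
import numpy as np

def project_box(x, lower, upper):
    """Euclidean projection onto the box {x : lower <= x <= upper}."""
    return np.clip(x, lower, upper)

def projected_gradient(grad_f, x0, lower, upper, step=0.1, tol=1e-8, max_iter=1000):
    """Classical projected gradient iteration x_{k+1} = P_C(x_k - a*grad_f(x_k)).

    The constant stepsize `step` is a placeholder for a stepsize rule;
    it is not the relaxed stepsize scheme studied in the paper.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        x_new = project_box(x - step * grad_f(x), lower, upper)
        if np.linalg.norm(x_new - x) <= tol:  # fixed point of the projected step
            return x_new
        x = x_new
    return x

# Usage: minimize ||x - c||^2 over the box [0, 1]^2 with c outside the box.
if __name__ == "__main__":
    c = np.array([2.0, -0.5])
    grad = lambda x: 2.0 * (x - c)  # gradient of ||x - c||^2
    x_star = projected_gradient(grad, np.zeros(2), 0.0, 1.0)
    print(x_star)  # converges to the projection of c onto the box: [1.0, 0.0]
```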
Related Items (9)
- A projected Newton method in a Cartesian product of balls
- A projected Newton method for minimization problems with nonlinear inequality constraints
- A Family of Supermemory Gradient Projection Methods for Constrained Optimization
- Variable metric gradient projection processes in convex feasible sets defined by nonlinear inequalities
- An Extension of the Projected Gradient Method to a Banach Space Setting with Application in Structural Topology Optimization
- Global convergence of a modified gradient projection method for convex constrained problems
- Decreasing the sensitivity of open-loop optimal solutions in decision making under uncertainty
- On the convergence of projected gradient processes to singular critical points
- Equality and inequality constrained optimization algorithms with convergent stepsizes
Cites Work
- Projection methods in constrained optimisation and applications to optimal policy decisions
- A feasible direction algorithm for convex optimization: Global convergence rates
- Newton’s Method and the Goldstein Step-Length Rule for Constrained Minimization Problems
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- On the Local Convergence of Quasi-Newton Methods for Constrained Optimization
- On the Goldstein-Levitin-Polyak gradient projection method
- Quasi-Newton Methods, Motivation and Theory
- Superlinearly convergent variable metric algorithms for general nonlinear programming problems
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- Projected Newton Methods for Optimization Problems with Simple Constraints
- Convex programming in Hilbert space
- The Gradient Projection Method under Mild Differentiability Conditions