On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces
From MaRDI portal
Publication:2805982
Abstract: This work focuses on the convergence analysis of the projected gradient method for solving constrained convex minimization problems in Hilbert spaces. We show that the sequence of points generated by the method employing the Armijo linesearch converges weakly to a solution of the considered convex optimization problem. Weak convergence is established by assuming convexity and Gâteaux differentiability of the objective function, whose Gâteaux derivative is assumed to be uniformly continuous on bounded sets. Furthermore, we propose modifications of the classical projected gradient method in order to obtain strong convergence. The new variant has the following desirable properties: the sequence of generated points is entirely contained in a ball with diameter equal to the distance between the initial point and the solution set, and the whole sequence converges strongly to the solution of the problem that lies closest to the initial iterate. Convergence analysis of both methods is presented without any Lipschitz continuity assumption on the gradient.
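The scheme described in the abstract can be sketched as follows: at each iterate, project a gradient step onto the feasible set and backtrack the stepsize until an Armijo-type sufficient-decrease condition holds along the projection arc. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names, parameter values, and the box-constraint example are assumptions for demonstration.

```python
import numpy as np

def projected_gradient_armijo(f, grad, proj, x0, sigma=1e-4, beta=0.5,
                              max_iter=500, tol=1e-8):
    """Projected gradient method with Armijo backtracking (illustrative sketch).

    f, grad: objective and its gradient; proj: projection onto the feasible set.
    sigma, beta: Armijo sufficient-decrease and backtracking parameters
    (hypothetical defaults, not taken from the paper).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        alpha = 1.0
        while True:
            z = proj(x - alpha * g)
            # Armijo condition along the projection arc:
            # accept if the decrease is at least sigma * <grad f(x), x - z>
            if f(z) <= f(x) - sigma * g.dot(x - z) or alpha < 1e-12:
                break
            alpha *= beta  # backtrack the stepsize
        if np.linalg.norm(x - z) < tol:
            return z  # fixed point of the projected gradient step
        x = z
    return x

# Example: minimize ||x - c||^2 over the nonnegative orthant
c = np.array([1.0, -2.0, 3.0])
f = lambda x: np.sum((x - c) ** 2)
grad = lambda x: 2.0 * (x - c)
proj = lambda x: np.maximum(x, 0.0)  # projection onto {x : x >= 0}

x_star = projected_gradient_armijo(f, grad, proj, np.zeros(3))
# x_star approximates [1, 0, 3], the projection of c onto the orthant
```

Note that only gradient evaluations and projections are used; no Lipschitz constant of the gradient is needed, which mirrors the setting the abstract emphasizes.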
Recommendations
- On the convergence properties of the projected gradient method for convex optimization
- Strong convergence of the gradient-projection algorithm in Hilbert spaces
- On the convergence of the projected gradient method for vector optimization
- Strong convergence of a projected gradient method
- On the strong convergence of a projection-based algorithm in Hilbert spaces
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- On the convergence of the projected gradient method
- On the gradient projection method for weakly convex functions on a proximally smooth set
- Strong convergence of projected reflected gradient methods for variational inequalities
- Convergence of the gradient projection method for generalized convex minimization
Cites work
- scientific article; zbMATH DE number 4164577 (no title available)
- A Mesh-Independence Principle for Operator Equations and Their Discretizations
- A Strongly Convergent Direct Method for Monotone Variational Inequalities in Hilbert Spaces
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
- Application of the Mesh Independence Principle to Mesh Refinement Strategies
- Averaged mappings and the gradient-projection algorithm
- Convergence of a projected gradient method variant for quasiconvex objectives
- Convergence theorems for sequences of nonlinear operators in Banach spaces
- Convex analysis and monotone operator theory in Hilbert spaces
- Convex programming in Hilbert space
- Entropy-Like Proximal Methods in Convex Programming
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Full convergence of the steepest descent method with inexact line searches
- Level bundle-like algorithms for convex optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Minimization of unsmooth functionals
- New Proximal Point Algorithms for Convex Minimization
- Newton's Mesh Independence Principle for a Class Of Optimal Shape Design Problems
- On the convergence properties of the projected gradient method for convex optimization
- Reconstruction of pictures from their projections
- Strong convergence in Hilbert spaces via \(\varGamma \)-duality
- System modelling and optimization. Proceedings of the 16th IFIP-TC7 conference, Compiègne, France, July 5-9, 1993
Cited in (10)
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- Strong convergence of a projected gradient method
- On the convergence properties of the projected gradient method for convex optimization
- Weak and norm convergence of a parallel projection method in Hilbert spaces
- On the convergence of the forward-backward splitting method with linesearches
- A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- Projected stochastic gradients for convex constrained problems in Hilbert spaces
- Strong convergence of expected-projection methods in Hilbert spaces
- A strongly convergent proximal point method for vector optimization