On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces
From MaRDI portal
Publication:2805982
DOI: 10.1080/01630563.2015.1080271
zbMATH Open: 1338.90297
arXiv: 1402.5884
OpenAlex: W2255734980
MaRDI QID: Q2805982
FDO: Q2805982
Authors: Welington de Oliveira, J. Y. Bello Cruz
Publication date: 13 May 2016
Published in: Numerical Functional Analysis and Optimization
Abstract: This work focuses on the convergence analysis of the projected gradient method for solving constrained convex minimization problems in Hilbert spaces. We show that the sequence of points generated by the method employing the Armijo linesearch converges weakly to a solution of the considered convex optimization problem. Weak convergence is established by assuming convexity and Gâteaux differentiability of the objective function, whose Gâteaux derivative is supposed to be uniformly continuous on bounded sets. Furthermore, we propose some modifications of the classical projected gradient method in order to obtain strong convergence. The new variant has the following desirable properties: the sequence of generated points is entirely contained in a ball with diameter equal to the distance between the initial point and the solution set, and the whole sequence converges strongly to the solution of the problem that lies closest to the initial iterate. The convergence analysis of both methods is presented without any Lipschitz continuity assumption on the gradient.
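The projected gradient method with Armijo linesearch described in the abstract can be sketched in finite dimensions as follows. This is a minimal illustration, not the paper's exact algorithm: the step-size parameters (`beta`, `sigma`, `t0`), the stopping rule, and the ball constraint used in the example are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto the closed ball {x : ||x|| <= radius},
    # a simple example of a closed convex constraint set.
    n = np.linalg.norm(x)
    return x if n <= radius else (radius / n) * x

def projected_gradient_armijo(f, grad, proj, x0, beta=0.5, sigma=1e-4,
                              t0=1.0, tol=1e-8, max_iter=1000):
    """Projected gradient step x_{k+1} = P_C(x_k - t_k grad f(x_k)),
    with t_k chosen by Armijo backtracking (illustrative sketch)."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        t = t0
        while True:
            x_trial = proj(x - t * g)
            d = x_trial - x  # feasible direction toward the projected step
            # Armijo sufficient-decrease condition along the direction d
            if f(x_trial) <= f(x) + sigma * np.dot(g, d) or t < 1e-12:
                break
            t *= beta
        if np.linalg.norm(x_trial - x) < tol:
            return x_trial
        x = x_trial
    return x
```

For instance, minimizing f(x) = ½‖x − c‖² with c = (2, 0) over the unit ball drives the iterates to the boundary point (1, 0), the projection of c onto the constraint set.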
Full work available at URL: https://arxiv.org/abs/1402.5884
Recommendations
- On the convergence properties of the projected gradient method for convex optimization
- Strong convergence of the gradient-projection algorithm in Hilbert spaces
- On the convergence of the projected gradient method for vector optimization
- Strong convergence of a projected gradient method
- On the strong convergence of a projection-based algorithm in Hilbert spaces
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- On the convergence of the projected gradient method
- On the gradient projection method for weakly convex functions on a proximally smooth set
- Strong convergence of projected reflected gradient methods for variational inequalities
- Convergence of the gradient projection method for generalized convex minimization
Cites Work
- Convex analysis and monotone operator theory in Hilbert spaces
- Title not available
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Convex programming in Hilbert space
- Convergence theorems for sequences of nonlinear operators in Banach spaces
- Minimization of functions having Lipschitz continuous first partial derivatives
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- Averaged mappings and the gradient-projection algorithm
- Level bundle-like algorithms for convex optimization
- Entropy-Like Proximal Methods in Convex Programming
- A Mesh-Independence Principle for Operator Equations and Their Discretizations
- Newton's Mesh Independence Principle for a Class Of Optimal Shape Design Problems
- A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
- A Strongly Convergent Direct Method for Monotone Variational Inequalities in Hilbert Spaces
- Minimization of unsmooth functionals
- Convergence of a projected gradient method variant for quasiconvex objectives
- Full convergence of the steepest descent method with inexact line searches
- Strong convergence in Hilbert spaces via \(\varGamma \)-duality
- New Proximal Point Algorithms for Convex Minimization
- System modelling and optimization. Proceedings of the 16th IFIP-TC7 conference, Compiègne, France, July 5-9, 1993
- Application of the Mesh Independence Principle to Mesh Refinement Strategies
- Reconstruction of pictures from their projections
- On the convergence properties of the projected gradient method for convex optimization
Cited In (10)
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- On the convergence of the forward-backward splitting method with linesearches
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- Strong convergence of a projected gradient method
- Weak and norm convergence of a parallel projection method in Hilbert spaces
- A strongly convergent proximal point method for vector optimization
- A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
- On the convergence properties of the projected gradient method for convex optimization
- Strong convergence of expected-projection methods in Hilbert spaces
- Projected stochastic gradients for convex constrained problems in Hilbert spaces