On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces
Publication: 2805982
DOI: 10.1080/01630563.2015.1080271
zbMath: 1338.90297
arXiv: 1402.5884
OpenAlex: W2255734980
MaRDI QID: Q2805982
Welington de Oliveira, José Yunier Bello Cruz
Publication date: 13 May 2016
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://arxiv.org/abs/1402.5884
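For readers unfamiliar with the algorithm this publication analyzes, the classical projected gradient iteration is x_{k+1} = P_C(x_k - α ∇f(x_k)), where P_C is the metric projection onto the feasible convex set C. The following is a minimal illustrative sketch, not the paper's method: it uses a hypothetical quadratic objective, a box constraint [0,1]² (whose projection is coordinatewise clamping), and a fixed step size chosen for demonstration only.

```python
def project_box(x, lo=0.0, hi=1.0):
    # Metric projection onto the box [lo, hi]^n: clamp each coordinate.
    return [min(hi, max(lo, xi)) for xi in x]

def grad_f(x):
    # Gradient of the illustrative objective f(x, y) = (x - 2)^2 + (y + 1)^2.
    return [2.0 * (x[0] - 2.0), 2.0 * (x[1] + 1.0)]

def projected_gradient(x0, step=0.1, iters=100):
    # Fixed-step projected gradient: x_{k+1} = P_C(x_k - step * grad f(x_k)).
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        x = project_box([xi - step * gi for xi, gi in zip(x, g)])
    return x

# The unconstrained minimizer (2, -1) lies outside [0,1]^2, so the
# iterates converge to the boundary point (1, 0).
sol = projected_gradient([0.5, 0.5])
```

With a Lipschitz-continuous gradient and a sufficiently small step size, these iterates converge weakly to a solution in a general real Hilbert space; the publication above studies when convergence is also strong.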
Related Items (4)
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- A strongly convergent proximal point method for vector optimization
- On the convergence of the forward–backward splitting method with linesearches
- Projected Stochastic Gradients for Convex Constrained Problems in Hilbert Spaces
Cites Work
- Strong convergence in Hilbert spaces via \(\varGamma \)-duality
- On the convergence properties of the projected gradient method for convex optimization
- Averaged mappings and the gradient-projection algorithm
- Level bundle-like algorithms for convex optimization
- Convergence of a projected gradient method variant for quasiconvex objectives
- System modelling and optimization. Proceedings of the 16th IFIP-TC7 conference, Compiègne, France, July 5-9, 1993
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Convergence theorems for sequences of nonlinear operators in Banach spaces
- Minimization of functions having Lipschitz continuous first partial derivatives
- A Strongly Convergent Method for Nonsmooth Convex Minimization in Hilbert Spaces
- A Strongly Convergent Direct Method for Monotone Variational Inequalities in Hilbert Spaces
- A Mesh-Independence Principle for Operator Equations and Their Discretizations
- Application of the Mesh Independence Principle to Mesh Refinement Strategies
- New Proximal Point Algorithms for Convex Minimization
- Newton's Mesh Independence Principle for a Class Of Optimal Shape Design Problems
- Entropy-Like Proximal Methods in Convex Programming
- Full convergence of the steepest descent method with inexact line searches
- Convex programming in Hilbert space
- Reconstruction of pictures from their projections
- Minimization of unsmooth functionals
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- Convex analysis and monotone operator theory in Hilbert spaces