On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces

From MaRDI portal
Publication:2805982

DOI: 10.1080/01630563.2015.1080271
zbMATH Open: 1338.90297
arXiv: 1402.5884
OpenAlex: W2255734980
MaRDI QID: Q2805982
FDO: Q2805982


Authors: Welington de Oliveira, J. Y. Bello Cruz


Publication date: 13 May 2016

Published in: Numerical Functional Analysis and Optimization

Abstract: This work focuses on the convergence analysis of the projected gradient method for solving constrained convex minimization problems in Hilbert spaces. We show that the sequence of points generated by the method employing the Armijo linesearch converges weakly to a solution of the considered convex optimization problem. Weak convergence is established by assuming convexity and Gâteaux differentiability of the objective function, whose Gâteaux derivative is assumed to be uniformly continuous on bounded sets. Furthermore, we propose some modifications of the classical projected gradient method in order to obtain strong convergence. The new variant has the following desirable properties: the sequence of generated points is entirely contained in a ball with diameter equal to the distance between the initial point and the solution set, and the whole sequence converges strongly to the solution of the problem that lies closest to the initial iterate. Convergence analysis of both methods is presented without any Lipschitz continuity assumption on the derivative.
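The abstract refers to the classical projected gradient iteration with a backtracking Armijo linesearch. As a hedged sketch only (the linesearch convention and constants below are illustrative assumptions, not reproduced from the paper), the iteration can be written as

    x^{k+1} = P_C\bigl(x^k - t_k \nabla f(x^k)\bigr),

where P_C denotes the metric projection onto the closed convex feasible set C, and t_k = \beta^{j_k} t_0 is the largest stepsize in a geometric backtracking sequence satisfying an Armijo-type sufficient-decrease test along the projection arc,

    f(x^{k+1}) \le f(x^k) + \sigma \langle \nabla f(x^k),\, x^{k+1} - x^k \rangle,    \sigma, \beta \in (0, 1).

Note that the paper's analysis assumes only that the Gâteaux derivative is uniformly continuous on bounded sets, rather than Lipschitz continuous.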


Full work available at URL: https://arxiv.org/abs/1402.5884
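To make the iteration above concrete, the following is a minimal finite-dimensional sketch in Python/NumPy, assuming a unit-ball constraint and a quadratic objective. The function names, parameter values, and stopping test are assumptions made for this example; the paper itself treats general convex problems in real Hilbert spaces and does not prescribe these constants.

import numpy as np

def project_unit_ball(x):
    # Euclidean projection onto the closed unit ball {x : ||x|| <= 1}.
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def projected_gradient(f, grad, project, x0, beta=0.5, sigma=1e-4,
                       t0=1.0, tol=1e-8, max_iter=500):
    # Iterates x_{k+1} = P_C(x_k - t_k * grad f(x_k)), with t_k found by
    # backtracking until an Armijo-type sufficient-decrease test along the
    # projection arc is satisfied.
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        t = t0
        while True:
            x_new = project(x - t * g)
            # Sufficient decrease relative to the projected step direction.
            if f(x_new) <= f(x) + sigma * np.dot(g, x_new - x) or t < 1e-12:
                break
            t *= beta
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: minimize a strongly convex quadratic over the unit ball.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([2.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(projected_gradient(f, grad, project_unit_ball, np.zeros(2)))

Backtracking along the projection arc avoids any need for a global Lipschitz constant of the gradient, which is the setting addressed by the paper's convergence analysis.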









