A Strongly Convergent Method for Nonsmooth Convex Minimization in Hilbert Spaces
From MaRDI portal
Publication: 3114576
DOI: 10.1080/01630563.2011.590914
zbMath: 1232.90319
MaRDI QID: Q3114576
José Yunier Bello Cruz, Alfredo Noel Iusem
Publication date: 19 February 2012
Published in: Numerical Functional Analysis and Optimization
Full work available at URL: https://doi.org/10.1080/01630563.2011.590914
Keywords: nonsmooth optimization; strong convergence; projection method; convex minimization; projected subgradient algorithm
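The keywords reference the projected subgradient algorithm for nonsmooth convex minimization. As a generic textbook illustration only (not the specific strongly convergent variant developed in the paper), the sketch below minimizes the nonsmooth convex function \(f(x) = \lVert x - b \rVert_1\) over the Euclidean unit ball, using a diminishing step size \(\alpha_k = 1/k\) and keeping the best iterate found; the test problem and all names are illustrative assumptions.

```python
import math

def proj_ball(x, r=1.0):
    # Euclidean projection onto the ball of radius r centered at the origin
    n = math.sqrt(sum(v * v for v in x))
    if n <= r:
        return x
    return [r * v / n for v in x]

def subgrad_l1(x, b):
    # A subgradient of f(x) = sum_i |x_i - b_i| (sign pattern of x - b)
    return [1.0 if xi > bi else (-1.0 if xi < bi else 0.0)
            for xi, bi in zip(x, b)]

def projected_subgradient(b, steps=2000):
    # Classical projected subgradient method with diminishing steps 1/k.
    # Function values need not decrease monotonically, so the best
    # iterate seen so far is tracked separately.
    x = [0.0] * len(b)
    best = x
    f_best = sum(abs(xi - bi) for xi, bi in zip(x, b))
    for k in range(1, steps + 1):
        g = subgrad_l1(x, b)
        alpha = 1.0 / k  # diminishing, non-summable step size
        x = proj_ball([xi - alpha * gi for xi, gi in zip(x, g)])
        f = sum(abs(xi - bi) for xi, bi in zip(x, b))
        if f < f_best:
            best, f_best = x, f
    return best, f_best
```

With `b = [0.6, 0.0]` (already inside the unit ball), the iterates oscillate around the minimizer `b` with shrinking amplitude, so `f_best` tends to zero. In infinite-dimensional Hilbert spaces this basic scheme is only weakly convergent in general, which is the gap the paper's method addresses.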
Related Items
- Multiple subgradient descent bundle method for convex nonsmooth multiobjective optimization
- Subgradient algorithms for solving variable inequalities
- A relaxed-projection splitting algorithm for variational inequalities in Hilbert spaces
- Strong convergence in Hilbert spaces via \(\varGamma\)-duality
- A subgradient-like algorithm for solving vector convex inequalities
- Level bundle-like algorithms for convex optimization
- Bundle-based descent method for nonsmooth multiobjective DC optimization with inequality constraints
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- Modified basic projection methods for a class of equilibrium problems
- Approximation of common solutions for system of equilibrium problems and fixed-point problems
- A strongly convergent proximal bundle method for convex minimization in Hilbert spaces
- On Weak and Strong Convergence of the Projected Gradient Method for Convex Optimization in Real Hilbert Spaces
- A variant of forward-backward splitting method for the sum of two monotone operators with a new search strategy
Cites Work
- Convergence of a projected gradient method variant for quasiconvex objectives
- System modelling and optimization. Proceedings of the 16th IFIP-TC7 conference, Compiègne, France, July 5-9, 1993
- Convergence of the gradient projection method for generalized convex minimization
- Forcing strong convergence of proximal point iterations in a Hilbert space
- Convergence of the steepest descent method for minimizing quasiconvex functions
- A Strongly Convergent Direct Method for Monotone Variational Inequalities in Hilbert Spaces
- A Mesh-Independence Principle for Operator Equations and Their Discretizations
- Application of the Mesh Independence Principle to Mesh Refinement Strategies
- New Proximal Point Algorithms for Convex Minimization
- Newton's Mesh Independence Principle for a Class of Optimal Shape Design Problems
- Full convergence of the steepest descent method with inexact line searches
- Reconstruction of pictures from their projections
- Minimization of unsmooth functionals
- A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
- Extension of subgradient techniques for nonsmooth optimization in Banach spaces