Fejér-monotone hybrid steepest descent method for affinely constrained and composite convex minimization
From MaRDI portal
Publication:4646540
DOI: 10.1080/02331934.2018.1505885
zbMATH Open: 1416.90034
arXiv: 1608.02500
OpenAlex: W2963289483
MaRDI QID: Q4646540
FDO: Q4646540
Konstantinos Slavakis, Isao Yamada
Publication date: 14 January 2019
Published in: Optimization
Abstract: This paper introduces the Fejér-monotone hybrid steepest descent method (FM-HSDM), a new member of the HSDM family of algorithms, for solving affinely constrained minimization tasks in real Hilbert spaces, where convex smooth and non-smooth losses compose the objective function. FM-HSDM offers sequences of estimates which converge weakly and, under certain hypotheses, strongly to solutions of the task at hand. Fixed-point theory, variational inequalities and affine-nonexpansive mappings are utilized to devise a scheme that accommodates affine constraints in a more versatile way than state-of-the-art primal-dual techniques and the alternating direction method of multipliers do. Recursions can be tuned to score low computational footprints, well-suited for large-scale optimization tasks, without compromising convergence guarantees. In contrast to its HSDM precursors, FM-HSDM enjoys Fejér monotonicity; the step-size parameter stays constant across iterations to promote convergence speed-ups of the sequence of estimates to a minimizer; and only Lipschitzian continuity, not strong monotonicity, of the derivative of the smooth-loss function is needed to ensure convergence. Results on the rate of convergence to an optimal point are also presented.
Full work available at URL: https://arxiv.org/abs/1608.02500
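To illustrate the flavor of the algorithm family the abstract describes, the following is a minimal sketch of a hybrid-steepest-descent-style iteration with a constant step size, applied to a toy affinely constrained quadratic. It is not the paper's FM-HSDM itself; the problem data, the pseudoinverse-based projection used as the affine-nonexpansive mapping, and the step size are all illustrative choices.

```python
import numpy as np

# Toy problem: minimize f(x) = 0.5 * ||x - b||^2  subject to  A @ x = c.
# T below is the (affine, nonexpansive) projection onto {x : A @ x = c},
# whose fixed-point set is exactly the constraint set.

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 5))   # full row rank almost surely
c = rng.standard_normal(2)
b = rng.standard_normal(5)

A_pinv = np.linalg.pinv(A)

def T(x):
    # Projection onto the affine constraint set via the pseudoinverse.
    return x - A_pinv @ (A @ x - c)

grad_f = lambda x: x - b          # gradient of f; Lipschitz with constant L = 1
lam = 0.5                         # constant step size, chosen below 2/L

x = np.zeros(5)
for _ in range(200):
    y = T(x)                      # enforce the affine constraint
    x = y - lam * grad_f(y)       # steepest-descent step on the smooth loss

x_star = T(x)                     # final feasible iterate
print(np.allclose(A @ x_star, c, atol=1e-8))   # → True (feasibility)
```

For this quadratic the iterate converges to the projection of b onto the constraint set, which is the constrained minimizer; the constant step size echoes the feature the abstract highlights for FM-HSDM, in contrast to the vanishing step sizes of earlier HSDM variants.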
Recommendations
- A hybrid steepest descent method for constrained convex optimization
- Incremental proximal method for nonsmooth convex optimization with fixed point constraints of quasi-nonexpansive mappings
- A hybrid descent method for solving a convex constrained optimization problem with applications
- Incremental gradient projection algorithm for constrained composite minimization problems
- Incremental subgradient method for nonsmooth convex optimization with fixed point constraints
Mathematics Subject Classification: Convex programming (90C25); Numerical methods for variational inequalities and related problems (65K15)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Title not available
- On Projection Algorithms for Solving Convex Feasibility Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Convex Analysis
- Generalized inverses. Theory and applications.
- A first-order primal-dual algorithm for convex problems with applications to imaging
- The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings
- Quelques propriétés des opérateurs angle-bornes et n-cycliquement monotones
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Hybrid Steepest Descent Method for Variational Inequality Problem over the Fixed Point Set of Certain Quasi-nonexpansive Mappings
- EXTRA: An Exact First-Order Algorithm for Decentralized Consensus Optimization
- Stochastic forward Douglas-Rachford splitting method for monotone inclusions
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Minimizing the Moreau Envelope of Nonsmooth Convex Functions over the Fixed Point Set of Certain Quasi-Nonexpansive Mappings
- Preconditioned Douglas--Rachford Splitting Methods for Convex-concave Saddle-point Problems
- Title not available
- Non-strictly convex minimization over the fixed point set of an asymptotically shrinking nonexpansive mapping
- A three-operator splitting scheme and its optimization applications
- Compositions and convex combinations of averaged nonexpansive operators
- Three-term conjugate gradient method for the convex optimization problem over the fixed point set of a nonexpansive mapping
- A Proximal Gradient Algorithm for Decentralized Composite Optimization
- Nonexpansiveness of a linearized augmented Lagrangian operator for hierarchical convex optimization
- Fejér-monotone hybrid steepest descent method for affinely constrained and composite convex minimization tasks
Cited In (3)
- The Stochastic Fejér-Monotone Hybrid Steepest Descent Method and the Hierarchical RLS
- Two modified inertial projection algorithms for bilevel pseudomonotone variational inequalities with applications to optimal control problems
- Fejér-monotone hybrid steepest descent method for affinely constrained and composite convex minimization tasks