Incremental subgradient method for nonsmooth convex optimization with fixed point constraints
DOI: 10.1080/10556788.2016.1175002 · zbMATH Open: 1354.65124 · OpenAlex: W2346898445 · MaRDI QID: Q2829569 · FDO: Q2829569
Author: Hideaki Iiduka
Publication date: 8 November 2016
Published in: Optimization Methods & Software
Full work available at URL: https://doi.org/10.1080/10556788.2016.1175002
Recommendations
- Incremental proximal method for nonsmooth convex optimization with fixed point constraints of quasi-nonexpansive mappings
- Proximal point algorithms for nonsmooth convex optimization with fixed point constraints
- Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings
- Incremental subgradient methods for nondifferentiable optimization in a Hilbert space
- Parallel computing subgradient method for nonsmooth convex optimization over the intersection of fixed point sets of nonexpansive mappings
Keywords: convergence; Hilbert space; fixed point; nonexpansive mapping; numerical example; subdifferential; nonsmooth convex optimization; incremental subgradient method; Krasnosel'skiĭ-Mann algorithm
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Applications of mathematical programming (90C90)
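The keywords above name the ingredients of the method: incremental subgradient steps for a nonsmooth convex sum, combined with a nonexpansive mapping whose fixed point set encodes the constraint. As a hedged illustration only (a generic toy sketch, not the paper's exact scheme), one can alternate a subgradient step on each component function with an application of a nonexpansive mapping T; here T is the projection onto an interval, so Fix(T) is the constraint set:

```python
# Hedged sketch of a generic incremental subgradient iteration with a
# fixed point constraint: minimize f(x) = sum_i |x - a_i| over Fix(T),
# where T is the (nonexpansive) projection onto [-1, 1], so Fix(T) = [-1, 1].
# Illustrative toy only; not the exact algorithm of the cited publication.

def T(x, lo=-1.0, hi=1.0):
    """Projection onto [lo, hi]: nonexpansive, with Fix(T) = [lo, hi]."""
    return min(max(x, lo), hi)

def subgrad_abs(x, a):
    """A subgradient of x -> |x - a| at x (0 is a valid choice at x = a)."""
    return (x > a) - (x < a)

anchors = [0.3, 0.8, 2.0]   # component functions f_i(x) = |x - a_i|
x = 5.0                     # arbitrary starting point
for k in range(1, 2001):
    lam = 1.0 / k           # diminishing step size
    for a in anchors:       # one incremental pass over the components
        x = T(x - lam * subgrad_abs(x, a))  # subgradient step, then apply T

print(round(x, 2))  # close to 0.8: the median of the anchors, which lies in [-1, 1]
```

With a diminishing step size, the iterate settles near the constrained minimizer; the per-component steps oscillate with amplitude on the order of the current step size, which is the usual behavior of incremental subgradient schemes.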
Cites Work
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- On Projection Algorithms for Solving Convex Feasibility Problems
- Convex analysis and monotone operator theory in Hilbert spaces
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Title not available
- Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- The viscosity approximation process for quasi-nonexpansive mappings in Hilbert spaces
- A Use of Conjugate Gradient Direction for the Convex Optimization Problem over the Fixed Point Set of a Nonexpansive Mapping
- Iterative algorithm for triple-hierarchical constrained nonconvex optimization problem and its application to network bandwidth allocation
- Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems
- Iterative algorithm for solving triple-hierarchical constrained optimization problem
- A Convergent Incremental Gradient Method with a Constant Step Size
- Title not available
- A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
- A block-iterative surrogate constraint splitting method for quadratic signal recovery
- VI-constrained hemivariational inequalities: distributed algorithms and power control in ad-hoc networks
- Acceleration method for convex optimization over the fixed point set of a nonexpansive mapping
- Incremental subgradient methods for nondifferentiable optimization
- Computational method for solving a stochastic linear-quadratic control problem given an unsolvable stochastic algebraic Riccati equation
- A proximal decomposition method for solving convex variational inverse problems
- Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms
- Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings
Cited In (25)
- Computation time of iterative methods for nonsmooth convex optimization with fixed point constraints of quasi-nonexpansive mappings
- Stochastic subgradient algorithm for nonsmooth nonconvex optimization
- Title not available
- Incremental gradient projection algorithm for constrained composite minimization problems
- Incremental quasi-Newton algorithms for solving a nonconvex, nonsmooth, finite-sum optimization problem
- Parallel computing subgradient method for nonsmooth convex optimization over the intersection of fixed point sets of nonexpansive mappings
- Iterative methods for parallel convex optimization with fixed point constraints
- Fejér-monotone hybrid steepest descent method for affinely constrained and composite convex minimization
- Nonexpansiveness of a linearized augmented Lagrangian operator for hierarchical convex optimization
- Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings
- Fixed point quasiconvex subgradient method
- Incremental subgradient methods for nondifferentiable optimization in a Hilbert space
- Two stochastic optimization algorithms for convex optimization with fixed point constraints
- Proximal point algorithms for nonsmooth convex optimization with fixed point constraints
- Primal-dual incremental gradient method for nonsmooth and convex optimization problems
- Minimizing the Moreau envelope of nonsmooth convex functions over the fixed point set of certain quasi-nonexpansive mappings
- Nonsmooth projection-free optimization with functional constraints
- Adaptive Projected Subgradient Method for Asymptotic Minimization of Sequence of Nonnegative Convex Functions
- String-averaging incremental subgradients for constrained convex optimization with applications to reconstruction of tomographic images
- Parallel subgradient method for nonsmooth convex optimization with a simple constraint
- Incremental subgradients for constrained convex optimization: A unified framework and new methods
- Convergence of a distributed method for minimizing sum of convex functions with fixed point constraints
- Incremental proximal method for nonsmooth convex optimization with fixed point constraints of quasi-nonexpansive mappings
- A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
- Path-based incremental target level algorithm on Riemannian manifolds