Convergence of a distributed method for minimizing sum of convex functions with fixed point constraints
From MaRDI portal
Publication:2073009
Recommendations
- Iterative methods for parallel convex optimization with fixed point constraints
- Incremental subgradient method for nonsmooth convex optimization with fixed point constraints
- Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms
- Incremental proximal method for nonsmooth convex optimization with fixed point constraints of quasi-nonexpansive mappings
- Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings
Cites work
- Untitled scientific article (zbMATH DE number 3359250)
- A descent SQP alternating direction method for minimizing the sum of three convex functions
- Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms
- An efficient iterative method for finding common fixed point and variational inequalities in Hilbert spaces
- Computational method for solving a stochastic linear-quadratic control problem given an unsolvable stochastic algebraic Riccati equation
- Convergence analysis for proximal split feasibility problems and fixed point problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions
- Extension of Fenchel's duality theorem for convex functions
- Extrapolated sequential constraint method for variational inequality over the intersection of fixed-point sets
- First-order methods in optimization
- Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems
- Hierarchical Convex Optimization With Primal-Dual Splitting
- Iterative algorithm for triple-hierarchical constrained nonconvex optimization problem and its application to network bandwidth allocation
- Iterative methods for fixed point problems in Hilbert spaces
- Iterative methods for parallel convex optimization with fixed point constraints
- Minimizing the Moreau envelope of nonsmooth convex functions over the fixed point set of certain quasi-nonexpansive mappings
- Proximal point algorithms for nonsmooth convex optimization with fixed point constraints
- Robust Wideband Beamforming by the Hybrid Steepest Descent Method
- Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization
- The Gradient Projection Method for Nonlinear Programming. Part II. Nonlinear Constraints
- The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings
Cited in (4)
- An asynchronous subgradient-proximal method for solving additive convex optimization problems
- EFIX: exact fixed point methods for distributed optimization
- A Distributed Algorithm for Computing a Common Fixed Point of a Finite Family of Paracontractions
- A dynamic distributed conjugate gradient method for variational inequality problem over the common fixed-point constraints