On the asymptotic behavior of the Douglas-Rachford and proximal-point algorithms for convex optimization
From MaRDI portal
Abstract: Banjac et al. (2019) recently showed that the Douglas-Rachford algorithm provides certificates of infeasibility for a class of convex optimization problems. In particular, they showed that the difference between consecutive iterates generated by the algorithm converges to certificates of primal and dual strong infeasibility. Their result was established in a finite-dimensional Euclidean setting and for a particular structure of the constraint set. In this paper, we extend the result to real Hilbert spaces and to a general nonempty closed convex constraint set. Moreover, we show that the proximal-point algorithm applied to the optimality conditions of the problem generates similar infeasibility certificates.
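The certificate behavior described in the abstract can be illustrated with a minimal sketch (an assumption for illustration, not the paper's exact setting): Douglas-Rachford applied to the infeasible feasibility problem of finding a point in A ∩ B, where A = (-∞, 0] and B = [1, ∞) in the real line, so dist(A, B) = 1. The iterates diverge, but the difference of consecutive iterates converges to the "gap vector" between the sets, which certifies infeasibility.

```python
# Hypothetical one-dimensional example: Douglas-Rachford for the
# infeasible feasibility problem A ∩ B with A = (-inf, 0], B = [1, inf).
def proj_A(x):
    return min(x, 0.0)   # projection onto (-inf, 0]

def proj_B(x):
    return max(x, 1.0)   # projection onto [1, inf)

def dr_step(x):
    # One Douglas-Rachford iteration: x+ = x + P_B(2 P_A(x) - x) - P_A(x)
    p = proj_A(x)
    return x + proj_B(2.0 * p - x) - p

x = 0.5
for _ in range(50):
    x_next = dr_step(x)
    diff = x_next - x    # difference of consecutive iterates
    x = x_next

# The difference converges to the minimal displacement between the sets,
# whose norm equals dist(A, B) = 1 -- an infeasibility certificate.
print(diff)  # → 1.0
```

Here the iterates x_k grow without bound, while x_{k+1} - x_k stabilizes immediately; in the paper's Hilbert-space setting the analogous limit yields certificates of primal and dual strong infeasibility.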
Recommendations
- On Slater's condition and finite convergence of the Douglas-Rachford algorithm for solving convex feasibility problems in Euclidean spaces
- On the finite convergence of the Douglas-Rachford algorithm for solving (not necessarily convex) feasibility problems in Euclidean spaces
- On the local convergence of the Douglas-Rachford algorithm
- The Douglas-Rachford algorithm in the affine-convex case
- A convergent relaxation of the Douglas-Rachford algorithm
Cites work
- scientific article; zbMATH DE number 3671159
- scientific article; zbMATH DE number 3441150
- Asymptotic behavior of contractions in Hilbert space
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Convex analysis and monotone operator theory in Hilbert spaces
- Douglas-Rachford splitting and ADMM for pathological convex optimization
- FBstab: a proximally stabilized semismooth algorithm for convex quadratic programming
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Infeasibility detection in the alternating direction method of multipliers for convex optimization
- OSQP: an operator splitting solver for quadratic programs
- On the Behavior of the Douglas-Rachford Algorithm for Minimizing a Convex Function Subject to a Linear Constraint
- On the Douglas-Rachford algorithm
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Shape restricted smoothing splines via constrained optimal control and nonsmooth Newton's methods
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The Douglas-Rachford algorithm for two (not necessarily intersecting) affine subspaces
- The Douglas-Rachford algorithm in the affine-convex case
- Weakly homogeneous variational inequalities and solvability of nonlinear equations over cones
Cited in (4)
- scientific article; zbMATH DE number 7201315
- On the Behavior of the Douglas-Rachford Algorithm for Minimizing a Convex Function Subject to a Linear Constraint
- scientific article; zbMATH DE number 5586189
- On a primal-dual Newton proximal method for convex quadratic programs
This page was built for publication: On the asymptotic behavior of the Douglas-Rachford and proximal-point algorithms for convex optimization (MaRDI item Q2230792)