Activity identification and local linear convergence of Douglas-Rachford/ADMM under partial smoothness
Publication: 3300345
Abstract: Convex optimization has become ubiquitous in most quantitative disciplines of science, including variational image processing. Proximal splitting algorithms are becoming popular for solving such structured convex optimization problems. Within this class of algorithms, Douglas-Rachford (DR) and the alternating direction method of multipliers (ADMM) are designed to minimize the sum of two proper lower semi-continuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local convergence behaviour of DR (resp. ADMM) when the involved functions (resp. their Legendre-Fenchel conjugates) are moreover partly smooth. More precisely, when both functions (resp. their conjugates) are partly smooth relative to their respective manifolds, we show that DR (resp. ADMM) identifies these manifolds in finite time. Moreover, when these manifolds are affine or linear, we prove that DR/ADMM is locally linearly convergent. When the two functions are locally polyhedral, we show that the optimal convergence rate is given in terms of the cosine of the Friedrichs angle between the tangent spaces of the identified manifolds. This is illustrated by several concrete examples and supported by numerical experiments.
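As a minimal sketch of the locally polyhedral case described in the abstract (not the authors' code), the snippet below runs the DR iteration on a two-subspace feasibility problem in R^2, where both proximity operators reduce to orthogonal projections. The choice of example, the angle parameter theta, and all function names are assumptions made for illustration. Following the cited subspace result ("The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle"), the observed linear contraction factor should match the cosine of the Friedrichs angle between the two lines.

```python
import numpy as np

# Feasibility problem: minimize i_U(x) + i_V(x), where U and V are two lines
# through the origin in R^2. Since U and V intersect only at 0, the DR
# iterates converge linearly to 0 with rate cos(theta_F), where theta_F is
# the Friedrichs angle between U and V (here simply theta).

theta = 0.3  # assumed angle between the two lines; equals the Friedrichs angle


def P_U(z):
    """Orthogonal projection onto U = span{(1, 0)}."""
    return np.array([z[0], 0.0])


def P_V(z):
    """Orthogonal projection onto V = span{(cos theta, sin theta)}."""
    d = np.array([np.cos(theta), np.sin(theta)])
    return d * (d @ z)


def dr_step(z):
    """One Douglas-Rachford update: z + P_V(2 P_U(z) - z) - P_U(z)."""
    x = P_U(z)
    return z + P_V(2.0 * x - z) - x


z = np.array([1.0, 2.0])      # arbitrary starting point
rate_pred = np.cos(theta)     # predicted linear rate: cos(Friedrichs angle)
for k in range(25):
    z_new = dr_step(z)
    # Since U and V intersect only at the origin, the limit point is 0,
    # so the per-step contraction is ||z_{k+1}|| / ||z_k||.
    rate_obs = np.linalg.norm(z_new) / np.linalg.norm(z)
    z = z_new
print(f"observed rate {rate_obs:.6f} vs predicted cos(theta_F) {rate_pred:.6f}")
```

For theta = 0.3, the observed contraction factor agrees with cos(theta_F) ≈ 0.955336 at every iteration, consistent with the sharp rate stated in the abstract for the locally polyhedral case.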
Recommendations
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Convergence rates of forward-Douglas-Rachford splitting method
- Local linear convergence of the ADMM/Douglas-Rachford algorithms without strong convexity and application to statistical imaging
- Convergence analysis of Douglas-Rachford splitting method for "strongly + weakly convex" programming
- Local linear convergence analysis of primal-dual splitting methods
Cites work
- scientific article; zbMATH DE number 2155014
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A generalized forward-backward splitting
- A proximal decomposition method for solving convex variational inverse problems
- A variational approach to remove outliers and impulse noise
- Active Sets, Nonsmoothness, and Sensitivity
- Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility
- Convex analysis and monotone operator theory in Hilbert spaces
- Eventual linear convergence of the Douglas-Rachford iteration for basis pursuit
- Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
- Identifying active manifolds
- Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
- Local linear convergence for alternating and averaged nonconvex projections
- Local linear convergence of the alternating direction method of multipliers on quadratic or linear programs
- Model selection with low complexity priors
- Nonconvex notions of regularity and convergence of fundamental algorithms for feasibility problems
- Orthogonal invariance and identifiability
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The Douglas-Rachford algorithm in the absence of convexity
- The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle
- Variational Analysis
Cited in (15)
- Local linear convergence of the ADMM/Douglas-Rachford algorithms without strong convexity and application to statistical imaging
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Local linear convergence of a primal-dual algorithm for the augmented convex models
- Partial smoothness and constant rank
- Convergence rates of forward-Douglas-Rachford splitting method
- The degrees of freedom of partly smooth regularizers
- On the finite convergence of the Douglas-Rachford algorithm for solving (not necessarily convex) feasibility problems in Euclidean spaces
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Active-set Newton methods and partial smoothness
- Convergence rates with inexact non-expansive operators
- Local linear convergence analysis of primal-dual splitting methods
- Douglas-Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms
- On the global and linear convergence of the generalized alternating direction method of multipliers
- A weakly convergent fully inexact Douglas-Rachford method with relative error tolerance
- Activity identification and local linear convergence of forward-backward-type methods