Activity identification and local linear convergence of Douglas-Rachford/ADMM under partial smoothness
DOI: 10.1007/978-3-319-18461-6_51 · zbMATH Open: 1444.90091 · arXiv: 1412.6858 · OpenAlex: W2220926539 · MaRDI QID: Q3300345
Authors: Jingwei Liang, Jalal Fadili, Gabriel Peyré, Russell Luke
Publication date: 28 July 2020
Published in: Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/1412.6858
Recommendations
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Convergence rates of forward-Douglas-Rachford splitting method
- Local linear convergence of the ADMM/Douglas-Rachford algorithms without strong convexity and application to statistical imaging
- Convergence analysis of Douglas-Rachford splitting method for "strongly + weakly" convex programming
- Local linear convergence analysis of primal-dual splitting methods
Keywords: ADMM; Douglas-Rachford splitting; local linear convergence; partial smoothness; finite activity identification
Cites Work
- Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Variational Analysis
- Convex analysis and monotone operator theory in Hilbert spaces
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Local linear convergence of the alternating direction method of multipliers on quadratic or linear programs
- The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle
- The Douglas-Rachford algorithm in the absence of convexity
- Orthogonal invariance and identifiability
- Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility
- Nonconvex notions of regularity and convergence of fundamental algorithms for feasibility problems
- Local linear convergence for alternating and averaged nonconvex projections
- A proximal decomposition method for solving convex variational inverse problems
- A variational approach to remove outliers and impulse noise
- Eventual linear convergence of the Douglas-Rachford iteration for basis pursuit
- Title not available
- Active Sets, Nonsmoothness, and Sensitivity
- A generalized forward-backward splitting
- Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
- Identifying active manifolds
- Model selection with low complexity priors
Cited In (15)
- On the finite convergence of the Douglas-Rachford algorithm for solving (not necessarily convex) feasibility problems in Euclidean spaces
- On the global and linear convergence of the generalized alternating direction method of multipliers
- The degrees of freedom of partly smooth regularizers
- Local linear convergence analysis of primal-dual splitting methods
- Activity identification and local linear convergence of forward-backward-type methods
- On the interplay between acceleration and identification for the proximal gradient algorithm
- A weakly convergent fully inexact Douglas-Rachford method with relative error tolerance
- Convergence rates with inexact non-expansive operators
- Local linear convergence of the ADMM/Douglas-Rachford algorithms without strong convexity and application to statistical imaging
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Local linear convergence of a primal-dual algorithm for the augmented convex models
- Active-set Newton methods and partial smoothness
- Douglas-Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms
- Partial smoothness and constant rank
- Convergence rates of forward-Douglas-Rachford splitting method