Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness
From MaRDI portal
Publication: 3300345
DOI: 10.1007/978-3-319-18461-6_51
zbMath: 1444.90091
arXiv: 1412.6858
OpenAlex: W2220926539
MaRDI QID: Q3300345
Gabriel Peyré, Jalal Fadili, Russell Luke, Jingwei Liang
Publication date: 28 July 2020
Published in: Lecture Notes in Computer Science
Full work available at URL: https://arxiv.org/abs/1412.6858
Keywords: ADMM, Douglas–Rachford splitting, local linear convergence, partial smoothness, finite activity identification
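For context, a standard (unrelaxed) form of the Douglas–Rachford iteration named in these keywords, applied to minimizing f(x) + g(x) with a step size γ > 0, can be written as follows; this is the textbook parametrization, and the exact form and relaxation studied in the publication may differ:
\[
x^{k} = \operatorname{prox}_{\gamma f}\!\left(z^{k}\right), \qquad
z^{k+1} = z^{k} + \operatorname{prox}_{\gamma g}\!\left(2x^{k} - z^{k}\right) - x^{k}.
\]
The publication analyzes when such iterates identify the active manifold of a partly smooth term after finitely many steps, after which the convergence is locally linear.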
Related Items (11)
- Convergence rates with inexact non-expansive operators
- Local Linear Convergence of the ADMM/Douglas–Rachford Algorithms without Strong Convexity and Application to Statistical Imaging
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Local linear convergence of a primal-dual algorithm for the augmented convex models
- The degrees of freedom of partly smooth regularizers
- A weakly convergent fully inexact Douglas-Rachford method with relative error tolerance
- On the Finite Convergence of the Douglas–Rachford Algorithm for Solving (Not Necessarily Convex) Feasibility Problems in Euclidean Spaces
- Local linear convergence analysis of Primal–Dual splitting methods
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Active-Set Newton Methods and Partial Smoothness
Cites Work
- Local linear convergence for alternating and averaged nonconvex projections
- A variational approach to remove outliers and impulse noise
- A first-order primal-dual algorithm for convex problems with applications to imaging
- The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle
- A Generalized Forward-Backward Splitting
- The Douglas–Rachford Algorithm in the Absence of Convexity
- Orthogonal Invariance and Identifiability
- Linear Convergence and Metric Selection for Douglas-Rachford Splitting and ADMM
- Eventual linear convergence of the Douglas-Rachford iteration for basis pursuit
- A proximal decomposition method for solving convex variational inverse problems
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Variational Analysis
- Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility
- Model selection with low complexity priors
- Active Sets, Nonsmoothness, and Sensitivity
- Faster Convergence Rates of Relaxed Peaceman-Rachford and ADMM Under Regularity Assumptions
- Local Linear Convergence of the Alternating Direction Method of Multipliers on Quadratic or Linear Programs
- Nonconvex Notions of Regularity and Convergence of Fundamental Algorithms for Feasibility Problems
- Convex analysis and monotone operator theory in Hilbert spaces