Anderson Accelerated Douglas–Rachford Splitting
DOI: 10.1137/19M1290097 · zbMath: 1458.90511 · arXiv: 1908.11482 · MaRDI QID: Q5146687
Anqi Fu, Junzi Zhang, Stephen P. Boyd
Publication date: 26 January 2021
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1908.11482
Keywords: global convergence; stabilization; nonsmooth convex optimization; Anderson acceleration; parallel and distributed optimization; pathological settings; proximal oracles
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonsmooth analysis (49J52); Methods of quasi-Newton type (90C53); Parallel algorithms in computer science (68W10); Distributed algorithms (68W15); Mathematical software, computer programs (educational aspects) (97N80)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Sparse inverse covariance estimation with the graphical lasso
- Conic optimization via operator splitting and homogeneous self-dual embedding
- An analysis for the DIIS acceleration method used in quantum chemistry calculations
- Block splitting for distributed optimization
- Convergence acceleration for the iterative solution of the equations X = AX + f
- Introductory lectures on convex optimization. A basic course.
- Alternating direction method with self-adaptive penalty parameters for monotone variational inequalities
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- Douglas–Rachford splitting and ADMM for pathological convex optimization
- A new use of Douglas–Rachford splitting for identifying infeasible, unbounded, and pathological conic programs
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- Considerations on the Implementation and Use of Anderson Acceleration on Distributed Memory and GPU-based Parallel Computers
- Two classes of multisecant methods for nonlinear acceleration
- Anderson Acceleration for Fixed-Point Iterations
- Dynamic Energy Management
- $\ell_1$ Trend Filtering
- Acceleration of vector sequences by multi-dimensional Δ² methods
- Extrapolation Methods for Vector Sequences
- LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares
- Distributed Linearized Alternating Direction Method of Multipliers for Composite Convex Consensus Optimization
- Shanks Sequence Transformations and Anderson Acceleration
- Convergence of the EDIIS Algorithm for Nonlinear Equations
- Parameter Selection and Preconditioning for a Graph Form Solver
- SuperMann: A Superlinearly Convergent Algorithm for Finding Fixed Points of Nonexpansive Operators
- A Proof That Anderson Acceleration Improves the Convergence Rate in Linearly Converging Fixed-Point Methods (But Not in Those Converging Quadratically)
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- Convergence Analysis for Anderson Acceleration
- Local Improvement Results for Anderson Acceleration with Inaccurate Function Evaluations
- Iterative Procedures for Nonlinear Integral Equations
- On a Device for Computing the $e_m(S_n)$ Transformation
- A distributed Douglas–Rachford splitting method for multi-block convex minimization problems