Anderson Accelerated Douglas--Rachford Splitting

From MaRDI portal
Publication: 5146687

DOI: 10.1137/19M1290097
zbMath: 1458.90511
arXiv: 1908.11482
MaRDI QID: Q5146687

Anqi Fu, Junzi Zhang, Stephen P. Boyd

Publication date: 26 January 2021

Published in: SIAM Journal on Scientific Computing

Full work available at URL: https://arxiv.org/abs/1908.11482
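The paper applies Anderson acceleration to the Douglas--Rachford fixed-point iteration. As an illustrative sketch only (not the authors' A2DR solver), a minimal type-II Anderson acceleration loop for a generic fixed-point map `g` might look like:

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, max_iter=100, tol=1e-10):
    """Type-II Anderson acceleration for the fixed-point iteration x = g(x).

    Illustrative sketch; `g`, `m` (memory), and the stopping rule are
    generic choices, not the paper's A2DR implementation.
    """
    x = np.asarray(x0, dtype=float)
    G_hist, F_hist = [], []          # past values g(x_k) and residuals f_k = g(x_k) - x_k
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x
        if np.linalg.norm(f) < tol:
            return gx
        G_hist.append(gx)
        F_hist.append(f)
        if len(F_hist) > m + 1:      # keep at most m+1 past residuals
            G_hist.pop(0)
            F_hist.pop(0)
        if len(F_hist) == 1:
            x = gx                   # plain fixed-point step on the first iteration
            continue
        # Least-squares combination of past residuals (Walker-Ni difference form):
        # minimize ||f - dF @ gamma||, then extrapolate with the same weights.
        dF = np.column_stack([F_hist[i + 1] - F_hist[i] for i in range(len(F_hist) - 1)])
        dG = np.column_stack([G_hist[i + 1] - G_hist[i] for i in range(len(G_hist) - 1)])
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
        x = gx - dG @ gamma
    return x
```

On contractive maps this typically converges in far fewer iterations than the plain fixed-point recursion, which is the effect the paper exploits for Douglas--Rachford splitting.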



Related Items

- Eigen-stratified models
- Convergence of a Constrained Vector Extrapolation Scheme
- Anderson acceleration based on the \(\mathcal{H}^{-s}\) Sobolev norm for contractive and noncontractive fixed-point operators
- Anderson accelerating the preconditioned modulus approach for linear complementarity problems on second-order cones
- Anderson acceleration for a regularized Bingham model
- The effect of Anderson acceleration on superlinear and sublinear convergence
- Efficient and effective algebraic splitting-based solvers for nonlinear saddle point problems
- Sparse basis pursuit for compliance minimization in the vanishing volume ratio limit
- Nonmonotone globalization for Anderson acceleration via adaptive regularization
- Descent Properties of an Anderson Accelerated Gradient Method with Restarting
- Anderson acceleration as a Krylov method with application to convergence analysis
- Enabling convergence of the iterated penalty Picard iteration with \(O(1)\) penalty parameter for incompressible Navier-Stokes via Anderson acceleration
- On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
- Risk-averse stochastic programming and distributionally robust optimization via operator splitting
- Multiscale Hierarchical Image Decomposition and Refinements: Qualitative and Quantitative Results
- Improved convergence of the Arrow-Hurwicz iteration for the Navier-Stokes equation via grad-div stabilization and Anderson acceleration
- Large-Scale Optimization with Linear Equality Constraints Using Reduced Compact Representation
