On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
DOI: 10.1007/s10915-021-01548-2
zbMath: 1487.65004
arXiv: 2007.02916
OpenAlex: W3173679549
MaRDI QID: Q2049085
Yunhui He, Da-Wei Wang, Hans De Sterck
Publication date: 24 August 2021
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/2007.02916
Mathematics Subject Classification
- Numerical optimization and variational techniques (65K10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Extrapolation to the limit, deferred corrections (65B05)
Related Items (11)
- Linear Asymptotic Convergence of Anderson Acceleration: Fixed-Point Analysis
- On an improved PDE-based elliptic parameterization method for isogeometric analysis using preconditioned Anderson acceleration
- Composite Anderson acceleration method with two window sizes and optimized damping
- The effect of Anderson acceleration on superlinear and sublinear convergence
- Fast gradient method for low-rank matrix estimation
- Filtering for Anderson Acceleration
- Newton-Anderson at Singular Points
- Nonmonotone globalization for Anderson acceleration via adaptive regularization
- Asymptotic convergence analysis and influence of initial guesses on composite Anderson acceleration
- Descent Properties of an Anderson Accelerated Gradient Method with Restarting
- Anderson acceleration as a Krylov method with application to convergence analysis
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- On the linear convergence of the alternating direction method of multipliers
- Parallel alternating direction multiplier decomposition of convex programs
- On non-ergodic convergence rate of Douglas–Rachford alternating direction method of multipliers
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Nonlinearly Preconditioned Optimization on Grassmann Manifolds for Computing Approximate Tucker Tensor Decompositions
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- A Nonlinear GMRES Optimization Algorithm for Canonical Tensor Decomposition
- Optimal Parameter Selection for the Alternating Direction Method of Multipliers (ADMM): Quadratic Problems
- Anderson Acceleration for Fixed-Point Iterations
- Nonlinearly preconditioned L-BFGS as an acceleration mechanism for alternating least squares with application to tensor decomposition
- A nonlinearly preconditioned conjugate gradient algorithm for rank-R canonical tensor approximation
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- GMRES-Accelerated ADMM for Quadratic Objectives
- Shanks Sequence Transformations and Anderson Acceleration
- Iterative Solution of Nonlinear Equations in Several Variables
- Nesterov acceleration of alternating least squares for canonical tensor decomposition: Momentum step size selection and restart mechanisms
- Anderson Accelerated Douglas–Rachford Splitting
- Fast Alternating Direction Optimization Methods
- Convergence Analysis for Anderson Acceleration
- Faster Convergence Rates of Relaxed Peaceman–Rachford and ADMM Under Regularity Assumptions
- Iterative Procedures for Nonlinear Integral Equations