Descent Properties of an Anderson Accelerated Gradient Method with Restarting
Publication:6188506
DOI: 10.1137/22M151460X
arXiv: 2206.01372
OpenAlex: W4391029681
MaRDI QID: Q6188506
Wenqing Ouyang, Andre Milzarek, Unnamed Author
Publication date: 7 February 2024
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2206.01372
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
- Methods of quasi-Newton type (90C53)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Damped Anderson Acceleration With Restarts and Monotonicity Control for Accelerating EM and EM-like Algorithms
- An analysis for the DIIS acceleration method used in quantum chemistry calculations
- Robust inversion, dimensionality reduction, and randomized sampling
- Lectures on convex optimization
- Relaxation methods of best strategy for solving linear systems of equations [Relaxationsmethoden bester Strategie zur Lösung linearer Gleichungssysteme]
- A comparative study on methods for convergence acceleration of iterative vector sequences
- A characterization of the behavior of the Anderson acceleration on linear problems
- On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
- Fast and stable nonconvex constrained distributed optimization: the ELLADA algorithm
- A Brief Introduction to Krylov Space Methods for Solving Linear Systems
- Considerations on the Implementation and Use of Anderson Acceleration on Distributed Memory and GPU-based Parallel Computers
- Two classes of multisecant methods for nonlinear acceleration
- Anderson Acceleration for Fixed-Point Iterations
- GMRES: A Generalized Minimal Residual Algorithm for Solving Nonsymmetric Linear Systems
- LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares
- On the Convergence of Algorithms with Restart
- Restart procedures for the conjugate gradient method
- A Newton basis GMRES implementation
- Anderson-Accelerated Convergence of Picard Iterations for Incompressible Navier--Stokes Equations
- Convergence analysis of adaptive DIIS algorithms with application to electronic ground state calculations
- Anderson acceleration for contractive and noncontractive operators
- Globally Convergent Type-I Anderson Acceleration for Nonsmooth Fixed-Point Iterations
- Anderson Accelerated Douglas--Rachford Splitting
- Anderson Acceleration for a Class of Nonsmooth Fixed-Point Problems
- A Proof That Anderson Acceleration Improves the Convergence Rate in Linearly Converging Fixed-Point Methods (But Not in Those Converging Quadratically)
- Convergence Analysis for Anderson Acceleration
- Iterative Procedures for Nonlinear Integral Equations
- Methods of conjugate gradients for solving linear systems