Anderson acceleration of gradient methods with energy for optimization problems
From MaRDI portal
Publication:6575307
Recommendations
- Descent Properties of an Anderson Accelerated Gradient Method with Restarting
- Anderson acceleration with truncated Gram-Schmidt
- Convergence of a Constrained Vector Extrapolation Scheme
- On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
- Anderson acceleration of the extragradient method for the nonlinear complementarity problems
Cites work
- A Proof That Anderson Acceleration Improves the Convergence Rate in Linearly Converging Fixed-Point Methods (But Not in Those Converging Quadratically)
- A characterization of the behavior of the Anderson acceleration on linear problems
- A comparative study on methods for convergence acceleration of iterative vector sequences
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- A fixed-point iteration method for high frequency Helmholtz equations
- An adaptive gradient method with energy and momentum
- An analysis for the DIIS acceleration method used in quantum chemistry calculations
- Anderson Accelerated Douglas--Rachford Splitting
- Anderson accelerated fixed-stress splitting schemes for consolidation of unsaturated porous media
- Anderson acceleration for a class of nonsmooth fixed-point problems
- Anderson acceleration for fixed-point iterations
- Convergence analysis for Anderson acceleration
- Convergence of the EDIIS algorithm for nonlinear equations
- Extrapolation Methods for Vector Sequences
- Globally Convergent Type-I Anderson Acceleration for Nonsmooth Fixed-Point Iterations
- Iterative Procedures for Nonlinear Integral Equations
- Krylov Subspace Acceleration of Nonlinear Multigrid with Application to Recirculating Flows
- Local improvement results for Anderson acceleration with inaccurate function evaluations
- Nonlinear acceleration of momentum and primal-dual algorithms
- On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
- On the asymptotic linear convergence speed of Anderson acceleration, Nesterov acceleration, and nonlinear GMRES
- SGEM: stochastic gradient with energy and momentum
- Some methods of speeding up the convergence of iteration methods
- Two classes of multisecant methods for nonlinear acceleration
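As background on the technique these works concern (not taken from this page), the standard Type-II Anderson acceleration of a fixed-point iteration x ← g(x) can be sketched as follows; the function name, window depth `m`, and tolerances are illustrative choices, not part of the cited publication:

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, tol=1e-10, max_iter=100):
    """Minimal sketch of Anderson acceleration (window depth m).

    At each step the last few iterates are combined with weights chosen,
    via an unconstrained least-squares problem in residual differences,
    to minimize the norm of the combined residual g(x) - x.
    """
    x = np.asarray(x0, dtype=float)
    X, F = [], []  # histories of mapped iterates g(x_k) and residuals f_k
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x
        if np.linalg.norm(f) < tol:
            return x
        X.append(gx)
        F.append(f)
        if len(F) > m + 1:  # keep a sliding window of m+1 entries
            X.pop(0)
            F.pop(0)
        if len(F) == 1:
            x = gx  # plain fixed-point step until history exists
        else:
            # Columns are consecutive differences of residuals / iterates.
            dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
            dX = np.column_stack([X[i + 1] - X[i] for i in range(len(X) - 1)])
            # Least-squares coefficients for the multisecant update.
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dX @ gamma
    return x
```

For a linear map g(x) = Ax + b with spectral radius of A below 1, the iteration converges to the fixed point (I − A)⁻¹b, and Anderson acceleration typically reaches it in far fewer steps than the plain iteration.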