nlTGCR: A Class of Nonlinear Acceleration Procedures Based on Conjugate Residuals
Publication: 6130649
DOI: 10.1137/23m1576360
arXiv: 2306.00325
MaRDI QID: Q6130649
Huan He, Unnamed Author, Yousef Saad, Yuanzhe Xi, Shifan Zhao
Publication date: 3 April 2024
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/2306.00325
Keywords: Newton's method; Anderson acceleration; deep learning; generalized conjugate residual; nonlinear acceleration; truncated GCR
MSC classifications: Artificial neural networks and deep learning (68T07); Numerical optimization and variational techniques (65K10); Methods of quasi-Newton type (90C53); Iterative numerical methods for linear systems (65F10); Approximation algorithms (68W25); Preconditioners for iterative methods (65F08)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Relaxationsmethoden bester Strategie zur Lösung linearer Gleichungssysteme [Relaxation methods of best strategy for solving linear systems of equations]
- On the limited memory BFGS method for large scale optimization
- Conjugate gradient type methods for unsymmetric and inconsistent systems of linear equations
- Generalized conjugate-gradient acceleration of nonsymmetrizable iterative methods
- A comparative study on methods for convergence acceleration of iterative vector sequences
- On the accurate discretization of a highly nonlinear boundary value problem
- Secant update generalized version of PSB: a new approach
- Secant update version of quasi-Newton PSB with weighted multisecant equations
- Minimization of functions having Lipschitz continuous first partial derivatives
- Accelerated block preconditioned gradient method for large scale wave functions calculations in density functional theory
- Elliptic Preconditioner for Accelerating the Self-Consistent Field Iteration in Kohn--Sham Density Functional Theory
- The Simplified Topological $\varepsilon$-Algorithms for Accelerating Sequences in a Vector Space
- Numerical Methods for Large Eigenvalue Problems
- Two classes of multisecant methods for nonlinear acceleration
- Variational Iterative Methods for Nonsymmetric Systems of Linear Equations
- Anderson Acceleration for Fixed-Point Iterations
- Hybrid Krylov Methods for Nonlinear Systems of Equations
- Extrapolation Methods for Vector Sequences
- Inexact Newton Methods
- A Polynomial Extrapolation Method for Finding Limits and Antilimits of Vector Sequences
- Convergence Theory of Nonlinear Newton–Krylov Algorithms
- Globally Convergent Inexact Newton Methods
- Solving Nonlinear Equations with Newton's Method
- Krylov Subspace Acceleration of Nonlinear Multigrid with Application to Recirculating Flows
- Shanks Sequence Transformations and Anderson Acceleration
- Anderson-Accelerated Convergence of Picard Iterations for Incompressible Navier--Stokes Equations
- The Mathematics of Financial Derivatives
- Choosing the Forcing Terms in an Inexact Newton Method
- LU implementation of the modified minimal polynomial extrapolation method for solving linear and nonlinear systems
- Krylov Methods for Nonsymmetric Linear Systems
- Convergence analysis of adaptive DIIS algorithms with application to electronic ground state calculations
- Shanks and Anderson-type acceleration techniques for systems of nonlinear equations
- Anderson Acceleration for a Class of Nonsmooth Fixed-Point Problems
- A Proof That Anderson Acceleration Improves the Convergence Rate in Linearly Converging Fixed-Point Methods (But Not in Those Converging Quadratically)
- Convergence Analysis for Anderson Acceleration
- Function minimization by conjugate gradients
- Open-Channel Flow
- Iterative Procedures for Nonlinear Integral Equations
- A New Algorithm for Unconstrained Optimization
- A Stochastic Approximation Method