On the asymptotic directions of the s-dimensional optimum gradient method
From MaRDI portal
Publication: 2526455
DOI: 10.1007/BF02165472
zbMath: 0153.46004
OpenAlex: W2055018991
MaRDI QID: Q2526455
Publication date: 1968
Published in: Numerische Mathematik
Full work available at URL: https://eudml.org/doc/131809
Related Items (34)
- On initial point selection of the steepest descent algorithm for general quadratic functions
- Projection-minimization methods for nonsymmetric linear systems
- Properties of the sequential gradient-restoration algorithm (SGRA). II: Convergence analysis
- On the worst case performance of the steepest descent algorithm for quadratic functions
- Steepest descent method with random step lengths
- s-step iterative methods for symmetric linear systems
- Estimation of spectral bounds in gradient algorithms
- Delayed Gradient Methods for Symmetric and Positive Definite Linear Systems
- An approach to backward analysis for linear and nonlinear iterative methods
- On the Forsythe conjecture
- Gradient algorithms for quadratic optimization with fast convergence rates
- A gradient method exploiting the two dimensional quadratic termination property
- Fast gradient methods with alignment for symmetric linear systems without using Cauchy step
- Computational methods of linear algebra
- An overview of nonlinear optimization
- GMRES algorithms over 35 years
- The method of conjugate gradients used in inverse iteration
- Gradient methods exploiting spectral properties
- A short note on the Q-linear convergence of the steepest descent method
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb R^{ d }\) and Hilbert spaces
- Scalar correction method for finding least-squares solutions on Hilbert spaces and its applications
- Conjugate gradient method for computing the Moore-Penrose inverse and rank of a matrix
- Gradient descent and fast artificial time integration
- Approximation methods for the unconstrained optimization
- Accelerated Gauss-Newton algorithms for nonlinear least squares problems
- A globally optimal tri-vector method to solve an ill-posed linear system
- On the steepest descent algorithm for quadratic functions
- On the asymptotic convergence and acceleration of gradient methods
- Level set of the asymptotic rate of convergence for the method of steepest descent
- Equipping the Barzilai--Borwein Method with the Two Dimensional Quadratic Termination Property
- A modified two-point stepsize gradient algorithm for unconstrained minimization
- An optimal tri-vector iterative algorithm for solving ill-posed linear inverse problems
- On the acceleration of the Barzilai-Borwein method
- Some remarks on the method of minimal residues
Cites Work
- Moments and characteristic roots
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- Über eine Verallgemeinerung einer Ungleichung von L.V. Kantorowitsch
- Über einige Methoden der Relaxationsrechnung
- An Iterative Least-Square Method Suitable for Solving Large Sparse Matrices
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Solving linear algebraic equations can be interesting