Gradient algorithms for quadratic optimization with fast convergence rates
Recommendations
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- Studying convergence of gradient algorithms via optimal experimental design theory
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- A Dynamical-System Analysis of the Optimum s-Gradient Algorithm
- Iterative minimization of quadratic functionals
Cites work
- scientific article; zbMATH DE number 3466802 (no title available)
- scientific article; zbMATH DE number 1346777 (no title available)
- scientific article; zbMATH DE number 3440485 (no title available)
- scientific article; zbMATH DE number 1857673 (no title available)
- A Dynamical-System Analysis of the Optimum s-Gradient Algorithm
- A new gradient method with an optimal stepsize property
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- On the asymptotic directions of the s-dimensional optimum gradient method
- Renormalised steepest descent in Hilbert space converges to a two-point attractor
- Studying convergence of gradient algorithms via optimal experimental design theory
- Topics in Advanced Econometrics
- Two-Point Step Size Gradient Methods
Cited in (14)
- Delayed gradient methods for symmetric and positive definite linear systems
- Level set of the asymptotic rate of convergence for the method of steepest descent
- Near optimal step size and momentum in gradient descent for quadratic functions
- Fibonacci scheme for quadratic functionals
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- A second-order gradient method for convex minimization
- Estimation of spectral bounds in gradient algorithms
- A gradient method exploiting the two dimensional quadratic termination property
- Studying convergence of gradient algorithms via optimal experimental design theory
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- On the steplength selection in gradient methods for unconstrained optimization
- Steepest descent method with random step lengths
- scientific article; zbMATH DE number 4139497 (no title available)
- scientific article; zbMATH DE number 3844940 (no title available)