An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
Publication: 360475
DOI: 10.1007/s11590-012-0491-7
zbMATH Open: 1276.90047
OpenAlex: W1994949122
MaRDI QID: Q360475
FDO: Q360475
Authors: Luc Pronzato, Elena Bukina, A. Zhigljavsky
Publication date: 27 August 2013
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-012-0491-7
Recommendations
- Gradient algorithms for quadratic optimization with fast convergence rates
- Studying convergence of gradient algorithms via optimal experimental design theory
- New stepsizes for the gradient method
- Near optimal step size and momentum in gradient descent for quadratic functions
- A Dynamical-System Analysis of the Optimum s-Gradient Algorithm
Keywords: conjugate gradient; quadratic optimization; Fibonacci numbers; arcsine distribution; estimation of leading eigenvalues; gradient algorithms
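The record itself does not describe the method, but for context on the problem class named in the keywords, the following is a minimal illustrative sketch of classical steepest descent on a quadratic \(f(x) = \tfrac12 x^\top A x - b^\top x\). This is the textbook baseline gradient algorithm, not the low-cost asymptotically optimal step-size rule of the paper; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Classical steepest descent for f(x) = 0.5*x'Ax - b'x,
    A symmetric positive definite. Uses the exact line-search
    step gamma_k = (g'g)/(g'Ag). Illustrative baseline only."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x - b                # gradient of the quadratic
        gg = g @ g
        if np.sqrt(gg) < tol:        # stop when the gradient is small
            break
        gamma = gg / (g @ (A @ g))   # exact minimizer along -g
        x = x - gamma * g
    return x, k

# Usage example on a small random SPD system.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)
x, iters = steepest_descent(A, b, np.zeros(5))
print(iters, np.linalg.norm(A @ x - b))
```

Exact line search makes each step locally optimal, yet the iteration can zigzag on ill-conditioned problems; step-size rules such as those in the cited works (e.g. two-point Barzilai-Borwein steps) aim to improve the asymptotic convergence rate at comparable per-iteration cost.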
Cites Work
- Title not available
- Two-Point Step Size Gradient Methods
- The block preconditioned conjugate gradient method on vector computers
- Efficient and reliable iterative methods for linear systems
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb{R}^d\) and Hilbert spaces
- Studying convergence of gradient algorithms via optimal experimental design theory
- A Dynamical-System Analysis of the Optimum s-Gradient Algorithm
- Practical Use of Polynomial Preconditionings for the Conjugate Gradient Method
- Gradient algorithms for quadratic optimization with fast convergence rates
- Title not available
Cited In (8)
- Studying convergence of gradient algorithms via optimal experimental design theory
- Performance analysis of greedy algorithms for minimising a maximum mean discrepancy
- On \(R\)-linear convergence analysis for a class of gradient methods
- On the steplength selection in gradient methods for unconstrained optimization
- Delayed gradient methods for symmetric and positive definite linear systems
- Gradient algorithms for quadratic optimization with fast convergence rates
- Spectral properties of Barzilai-Borwein rules in solving singly linearly constrained optimization problems subject to lower and upper bounds
- Estimation of spectral bounds in gradient algorithms