A new gradient method via least change secant update
From MaRDI portal
Publication:2995480
Recommendations
- On the acceleration of the Barzilai-Borwein method
- A new analysis on the Barzilai-Borwein gradient method
- Gradient methods with adaptive step-sizes
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
Cites work
- scientific article; zbMATH DE number 3408799
- scientific article; zbMATH DE number 2221955
- A spectral conjugate gradient method for unconstrained optimization
- Alternate minimization gradient method
- Convergence Conditions for Ascent Methods
- Global convergence of conjugate gradient methods without line search
- Modified two-point stepsize gradient methods for unconstrained optimization
- On the Barzilai and Borwein choice of steplength for the gradient method
- Two-Point Step Size Gradient Methods
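Several of the cited works revolve around the Barzilai-Borwein two-point stepsize ("Two-Point Step Size Gradient Methods" above). As background only, here is a minimal sketch of the classical BB1 stepsize on a convex quadratic; the function name and the test problem are illustrative and not taken from the publication this page records:

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Barzilai-Borwein (BB1) gradient method for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite. Illustrative sketch only."""
    x = x0.astype(float)
    g = A @ x - b                        # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)      # simple scaled first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # iterate and gradient differences
        alpha = (s @ s) / (s @ y)        # BB1 stepsize: s^T s / s^T y
        x, g = x_new, g_new
    return x

# Small SPD test problem (hypothetical data, for illustration)
A = np.diag([1.0, 5.0, 20.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(3))
```

For a strictly convex quadratic, s @ y equals s^T A s > 0, so the stepsize is well defined; the iteration is non-monotone but converges on such problems.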
Cited in (10)
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- A New Class of Incremental Gradient Methods for Least Squares Problems
- Scaled memoryless symmetric rank one method for large-scale optimization
- A study of the Dennis-Wolkowicz method on convex functions
- Least-Change Secant Updates of Nonsquare Matrices
- Least-Change Sparse Secant Update Methods with Inaccurate Secant Conditions
- Least-Change Secant Update Methods for Underdetermined Systems
- A modification of the gradient method and function extremization
- A secant-based Nesterov method for convex functions
- A new gradient method via quasi-Cauchy relation which guarantees descent