A new stepsize for the steepest descent method
Publication: Q5488734
zbMATH Open: 1101.65067 · MaRDI QID: Q5488734 · FDO: Q5488734
Author: Yaxiang Yuan
Publication date: 22 September 2006
Keywords: convergence; numerical results; algorithm; unconstrained optimization; line search; large scale problems; stepsize selection
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
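For orientation, the publication concerns stepsize selection in the steepest descent (gradient) method. The following minimal sketch shows the general scheme the paper improves upon, using the well-known Barzilai-Borwein (BB1) stepsize from the same literature as a stand-in; it is NOT the new stepsize proposed in this paper, and the quadratic test problem is an invented illustration.

```python
import numpy as np

def gradient_descent_bb(A, b, x0, tol=1e-8, max_iter=500):
    """Steepest descent on f(x) = 0.5 x^T A x - b^T x, with a
    Barzilai-Borwein (BB1) stepsize.  Illustrative sketch only:
    this stands in for the paper's new stepsize, which is not
    reproduced here."""
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A)    # conservative first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g          # gradient step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize: alpha = s^T s / s^T y
        denom = s @ y
        if denom > 0:
            alpha = (s @ s) / denom
        x, g = x_new, g_new
    return x

# Example: a small strictly convex quadratic (hypothetical data)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent_bb(A, b, np.zeros(2))
```

For a strictly convex quadratic the minimizer solves `A x = b`, so `x_star` can be checked directly against `np.linalg.solve(A, b)`.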
Cited In (55)
- New gradient methods with adaptive stepsizes by approximate models
- Several kinds of acceleration techniques for unconstrained optimization first-order algorithms
- Tikhonov regularization for a general nonlinear constrained optimization problem
- A new step length in a projection and contraction method
- A modified limited memory steepest descent method motivated by an inexact super-linear convergence rate analysis
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- The convergence of conjugate gradient method with nonmonotone line search
- Steplengths in the extragradient type methods
- An efficient gradient method using the Yuan steplength
- On the asymptotic convergence and acceleration of gradient methods
- The steepest descent algorithm without line search for \(p\)-Laplacian
- A new steepest descent method with global convergence properties
- Training GANs with centripetal acceleration
- A limited memory steepest descent method
- Steepest descent method with random step lengths
- Using gradient directions to get global convergence of Newton-type methods
- Cyclic gradient methods for unconstrained optimization
- Delayed weighted gradient method with simultaneous step-sizes for strongly convex optimization
- A modified steepest descent scheme for solving a class of parameter identification problems
- New stepsizes for the gradient method
- Accelerated multiple step-size methods for solving unconstrained optimization problems
- On the Preconditioned Delayed Weighted Gradient Method
- Analysis of the Barzilai-Borwein step-sizes for problems in Hilbert spaces
- A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- A delayed weighted gradient method for strictly convex quadratic minimization
- A new modified Barzilai-Borwein gradient method for the quadratic minimization problem
- On the steplength selection in gradient methods for unconstrained optimization
- Partial spectral projected gradient method with active-set strategy for linearly constrained optimization
- A joint matrix minimization approach for multi-image face recognition
- Two-Point Step Size Gradient Methods
- Diagonal BFGS updates and applications to the limited memory BFGS method
- On the acceleration of the Barzilai-Borwein method
- A novel of step size selection procedures for steepest descent method
- A new steplength selection for scaled gradient methods with application to image deblurring
- An overview of nonlinear optimization
- Gravity-magnetic cross-gradient joint inversion by the cyclic gradient method
- Delayed gradient methods for symmetric and positive definite linear systems
- Overlooked Branch Cut in Steepest Descent Method: Switching Line and Atomic Domain
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- A new conjugate gradient algorithm with cubic Barzilai-Borwein stepsize for unconstrained optimization
- Equipping the Barzilai-Borwein method with the two dimensional quadratic termination property
- On the steepest descent algorithm for quadratic functions
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- Fast gradient methods with alignment for symmetric linear systems without using Cauchy step
- A dynamical Tikhonov regularization for solving ill-posed linear algebraic systems
- Scaling on the spectral gradient method
- A gradient method exploiting the two dimensional quadratic termination property
- Variations of the steepest descent method in nonrestricted optimization
- A survey of gradient methods for solving nonlinear optimization
- A new search procedure of steepest ascent in response surface exploration
- Monotone projected gradient methods for large-scale box-constrained quadratic programming
- A new gradient method via quasi-Cauchy relation which guarantees descent
- A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables
- Gradient methods exploiting spectral properties