Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
From MaRDI portal
Publication: Q2406313
Recommendations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization
- Two novel gradient methods with optimal step sizes
- On the acceleration of the Barzilai-Borwein method
Cites work
- A Modified BFGS Algorithm for Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Benchmarking optimization software with performance profiles
- Modified two-point stepsize gradient methods for unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods via higher order tensor models
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- On spectral properties of steepest descent methods
- On the Barzilai and Borwein choice of steplength for the gradient method
- Scaling on the spectral gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- R-linear convergence of the Barzilai and Borwein gradient method
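Many of the works above revolve around the Barzilai-Borwein (BB) two-point stepsize, which chooses the steplength from a secant approximation to the Hessian rather than a line search. A minimal illustrative sketch (not the method of this publication; the function names and safeguard are assumptions for the example):

```python
import numpy as np

def bb_gradient_descent(grad, x0, n_iter=200, alpha0=1e-3):
    """Gradient descent with the first Barzilai-Borwein stepsize,
    alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
    y = g_k - g_{k-1} (a two-point secant approximation)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                 # initial stepsize before BB kicks in
    for _ in range(n_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x              # change in iterate
        y = g_new - g              # change in gradient
        sy = s @ y
        if sy > 0:                 # safeguard: keep the stepsize positive
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Example: strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer solves A x = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(3))
```

For strictly convex quadratics this iteration is nonmonotone in the objective but converges R-linearly, which is what several of the cited works analyze and accelerate.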
Cited in (25)
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Two novel gradient methods with optimal step sizes
- New adaptive Barzilai-Borwein step size and its application in solving large-scale optimization problems
- An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization
- Efficient methods for convex problems with Bregman Barzilai-Borwein step sizes
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Stochastic gradient method with Barzilai-Borwein step for unconstrained nonlinear optimization
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- An extremal problem with applications to renewable energy production
- New gradient methods with adaptive stepsizes by approximate models
- Continuous-time gradient-like descent algorithm for constrained convex unknown functions: penalty method application
- Gradient methods for large scale convex quadratic functions
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- Accelerated augmented Lagrangian method for total variation minimization
- Structured two-point stepsize gradient methods for nonlinear least squares
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A new two-step gradient-type method for large-scale unconstrained optimization
- scientific article, zbMATH DE number 5066295 (no title available)