Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
From MaRDI portal
Publication:2406313
DOI: 10.1016/J.CAM.2017.07.035 · zbMATH Open: 1380.90256 · OpenAlex: W2743352242 · MaRDI QID: Q2406313
Authors: Zexian Liu, Hongwei Liu
Publication date: 27 September 2017
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2017.07.035
Recommendations
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization
- Two novel gradient methods with optimal step sizes
- On the acceleration of the Barzilai-Borwein method
Keywords: quadratic model; gradient method; approximate optimal stepsize; Barzilai-Borwein (BB) method; BFGS update formula
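The keywords reference the Barzilai-Borwein (BB) stepsize, which the paper builds on. As a point of reference, here is a minimal sketch of the classic BB1 gradient method applied to a convex quadratic; the function names, initial step, and safeguard value are illustrative assumptions, not the paper's proposed methods:

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=200, tol=1e-8):
    """Gradient method with the BB1 stepsize alpha = (s's)/(s'y),
    where s = x_k - x_{k-1} and y = g_k - g_{k-1}. Minimal sketch."""
    x = x0.astype(float)
    g = grad(x)
    alpha = 1e-4  # small initial step before BB information exists (assumed)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 step; fall back to the small step if curvature is not positive
        alpha = (s @ s) / sy if sy > 0 else 1e-4
        x, g = x_new, g_new
    return x

# Example: f(x) = 0.5 x'Ax - b'x, so grad f(x) = Ax - b
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(3))
```

For strictly convex quadratics the BB method converges R-linearly (see the cited work below); the paper's contribution is to combine such stepsizes with approximate optimal stepsizes derived from quadratic models and the BFGS update.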
Cites Work
- Benchmarking optimization software with performance profiles.
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonmonotone Line Search Technique for Newton’s Method
- On spectral properties of steepest descent methods
- New quasi-Newton equation and related methods for unconstrained optimization
- Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- A Modified BFGS Algorithm for Unconstrained Optimization
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Modified two-point stepsize gradient methods for unconstrained optimization
- Scaling on the spectral gradient method
- New quasi-Newton methods via higher order tensor models
Cited In (25)
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Two novel gradient methods with optimal step sizes
- An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- Efficient methods for convex problems with Bregman Barzilai-Borwein step sizes
- New adaptive Barzilai-Borwein step size and its application in solving large-scale optimization problems
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- Stochastic gradient method with Barzilai-Borwein step for unconstrained nonlinear optimization
- An extremal problem with applications to renewable energy production
- New gradient methods with adaptive stepsizes by approximate models
- Continuous-time gradient-like descent algorithm for constrained convex unknown functions: penalty method application
- Gradient methods for large scale convex quadratic functions
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- Accelerated augmented Lagrangian method for total variation minimization
- Structured two-point stepsize gradient methods for nonlinear least squares
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- Title not available
- A new two-step gradient-type method for large-scale unconstrained optimization