Notes on the Dai-Yuan-Yuan modified spectral gradient method
DOI: 10.1016/j.cam.2010.04.012 | zbMATH Open: 1195.65081 | OpenAlex: W2073938719 | MaRDI QID: Q984907 | FDO: Q984907
Authors: J. Martínez
Publication date: 20 July 2010
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2010.04.012
Recommendations
- A modified spectral conjugate gradient method with global convergence
- Two modified Dai-Yuan nonlinear conjugate gradient methods
- A new spectral gradient method for unconstrained optimization
- scientific article; zbMATH DE number 6179208
- Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
Keywords: global convergence; numerical experiments; secant equation; quasi-Newton method; nonmonotone line search; spectral gradient method
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Methods of quasi-Newton type (90C53)
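For orientation, the sketch below illustrates the general ingredients named in the keywords and in the cited works "Two-Point Step Size Gradient Methods" and "A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization": a Barzilai-Borwein spectral gradient step combined with a GLL-style nonmonotone Armijo line search. It is a minimal sketch of those generic techniques, not the Dai-Yuan-Yuan modification studied in this paper; the test function, parameter values, and safeguards are hypothetical choices made for the example.

```python
# Minimal sketch, for illustration only: a generic Barzilai-Borwein spectral
# gradient method with a GLL-style nonmonotone Armijo line search.  This is
# NOT the Dai-Yuan-Yuan modification studied in the paper; the test function
# and all parameter values below are hypothetical choices.
import numpy as np

def rosenbrock(x):
    """Hypothetical test problem: 2-D Rosenbrock function and its gradient."""
    f = 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    g = np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    return f, g

def spectral_gradient(f_and_g, x0, tol=1e-6, max_iter=5000,
                      memory=10, gamma=1e-4, sigma=0.5):
    x = np.asarray(x0, dtype=float)
    f, g = f_and_g(x)
    f_hist = [f]                                        # recent values for the nonmonotone test
    alpha = 1.0 / max(np.linalg.norm(g, np.inf), 1.0)   # crude initial spectral step
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        d = -alpha * g                                  # BB-scaled steepest-descent direction
        f_ref = max(f_hist[-memory:])                   # GLL reference: max of last `memory` values
        lam = 1.0
        while True:                                     # nonmonotone Armijo backtracking
            x_new = x + lam * d
            f_new, g_new = f_and_g(x_new)
            if f_new <= f_ref + gamma * lam * np.dot(g, d) or lam < 1e-12:
                break
            lam *= sigma
        s, y = x_new - x, g_new - g
        sty = np.dot(s, y)
        # BB1 step size alpha = (s^T s) / (s^T y), safeguarded to stay positive and bounded
        alpha = np.dot(s, s) / sty if sty > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, f, g = x_new, f_new, g_new
        f_hist.append(f)
    return x, f

if __name__ == "__main__":
    x_star, f_star = spectral_gradient(rosenbrock, [-1.2, 1.0])
    print(x_star, f_star)   # should approach (1, 1) with f near 0
```

The nonmonotone reference value (the maximum of the last few function values) is what lets the BB step be accepted without forcing monotone decrease, which is the usual motivation for pairing spectral step sizes with this kind of line search.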
Cites Work
- CUTE
- Benchmarking optimization software with performance profiles
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- The cyclic Barzilai-Borwein method for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Gradient Method with Retards and Generalizations
- A limited memory BFGS-type method for large-scale unconstrained optimization
- New quasi-Newton equation and related methods for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Subspace Barzilai-Borwein gradient method for large-scale bound constrained optimization
- Nonmonotone globalization techniques for the Barzilai-Borwein gradient method
- On the nonmonotone line search
- Modified two-point stepsize gradient methods for unconstrained optimization
Cited In (16)
- A hybrid BB-type method for solving large scale unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Global convergence of a nonlinear conjugate gradient method
- An adaptive nonmonotone global Barzilai-Borwein gradient method for unconstrained optimization
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Title not available
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- A New Dai-Liao Conjugate Gradient Method based on Approximately Optimal Stepsize for Unconstrained Optimization
- New gradient methods with adaptive stepsizes by approximate models
- Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- Structured two-point stepsize gradient methods for nonlinear least squares
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- Scaling on the spectral gradient method
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
Uses Software
- CUTE