A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
Publication: 3462304
DOI: 10.1007/978-3-319-17689-5_3
zbMath: 1330.65084
OpenAlex: W2271321931
MaRDI QID: Q3462304
Yu-Hong Dai, Xiao Qi Yang, Mehiddin Al-Baali
Publication date: 5 January 2016
Published in: Numerical Analysis and Optimization
Full work available at URL: https://doi.org/10.1007/978-3-319-17689-5_3
Keywords: unconstrained optimization; numerical example; condition number; quadratic function; stepsize control; \(R\)-superlinear convergence; Barzilai and Borwein gradient method
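For context, the two classical Barzilai and Borwein stepsizes referenced by the keywords above are, for iterates \(x_k\) with gradients \(g_k\),
\[
\alpha_k^{BB1} = \frac{s_{k-1}^{T} s_{k-1}}{s_{k-1}^{T} y_{k-1}}, \qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{T} y_{k-1}}{y_{k-1}^{T} y_{k-1}}, \qquad
s_{k-1} = x_k - x_{k-1}, \; y_{k-1} = g_k - g_{k-1}.
\]
Both quotients can become negative or undefined when \(s_{k-1}^{T} y_{k-1} \le 0\), as may happen away from convexity; one natural always-positive alternative is the geometric mean \(\sqrt{\alpha_k^{BB1}\,\alpha_k^{BB2}} = \|s_{k-1}\| / \|y_{k-1}\|\). This is offered only as an illustrative sketch of the issue the title alludes to, not as the authors' exact formulation.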
Related Items (18)
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- A positive spectral gradient-like method for large-scale nonlinear monotone equations
- A new nonmonotone spectral residual method for nonsmooth nonlinear equations
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A derivative-free multivariate spectral projection algorithm for constrained nonlinear monotone equations
- A framework for convex-constrained monotone nonlinear equations and its special cases
- A new inertial-based method for solving pseudomonotone operator equations with application
- Adjoint-based optimal control of contractile elastic bodies. Application to limbless locomotion on frictional substrates
- A Barzilai-Borwein descent method for multiobjective optimization problems
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- Memory gradient method for multiobjective optimization
- Gradient methods exploiting spectral properties
- Stochastic variance reduced gradient methods using a trust-region-like scheme
- Unnamed Item
- On \(R\)-linear convergence analysis for a class of gradient methods
- A family of spectral gradient methods for optimization
- Variable metric proximal stochastic variance reduced gradient methods for nonconvex nonsmooth optimization
- NEW ADAPTIVE BARZILAI–BORWEIN STEP SIZE AND ITS APPLICATION IN SOLVING LARGE-SCALE OPTIMIZATION PROBLEMS
Cites Work
- Unnamed Item
- A new analysis on the Barzilai-Borwein gradient method
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- On the asymptotic behaviour of some new gradient methods
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- A new gradient method with an optimal stepsize property
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Inexact and Preconditioned Uzawa Algorithms for Saddle Point Problems
- Alternate step gradient method*
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Sparse Reconstruction by Separable Approximation
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Spectral residual method without gradient information for solving large-scale nonlinear systems of equations