A nonlinear conjugate gradient method based on the MBFGS secant condition
From MaRDI portal
Publication: 3423590
DOI: 10.1080/10556780500137041
zbMath: 1112.90096
OpenAlex: W2070617022
MaRDI QID: Q3423590
Publication date: 14 February 2007
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556780500137041
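The entry's title refers to a conjugate gradient method built on the MBFGS (modified BFGS) secant condition of Li and Fukushima. As a rough illustrative sketch only (not the paper's exact algorithm or parameter choices), a nonlinear CG iteration with a Hestenes-Stiefel-type beta computed from an MBFGS-style modified gradient difference might look as follows; the function `mbfgs_cg`, the parameter `t`, and the specific correction term `h` are assumptions made for illustration:

```python
import numpy as np

def mbfgs_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=500):
    """Illustrative nonlinear CG loop: the beta parameter uses an
    MBFGS-style modified gradient difference y* = y + h*s, where the
    correction h keeps s.y* positive even for nonconvex f.
    This is a sketch, not the method from the cited paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (stand-in for the
        # Wolfe-type line searches used in the CG literature).
        alpha, c1 = 1.0, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x
        y = g_new - g
        # MBFGS-style modification (assumed form): add a multiple of s
        # so that s.y* > 0; the weight t*||g|| is one common choice.
        h = max(0.0, -y.dot(s) / s.dot(s)) + t * np.linalg.norm(g)
        y_star = y + h * s
        # Hestenes-Stiefel-type beta with the modified vector y*.
        denom = d.dot(y_star)
        beta = g_new.dot(y_star) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not descent.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic this sketch converges to the minimizer; the safeguarded restart mirrors the descent-condition concerns that dominate the related items listed below.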
Related Items
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- Sufficient descent conjugate gradient methods for large-scale optimization problems
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified scaling parameter for the memoryless BFGS updating formula
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- A modified conjugate gradient method based on a modified secant equation
- A hybrid quasi-Newton method with application in sparse recovery
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- An adaptive modified three-term conjugate gradient method with global convergence
- A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- Some modified Yabe-Takano conjugate gradient methods with sufficient descent condition
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function
- A survey of gradient methods for solving nonlinear optimization
- An improved nonlinear conjugate gradient method with an optimal property
- Two modified three-term conjugate gradient methods with sufficient descent property
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Conjugate gradient methods using value of objective function for unconstrained optimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Two new conjugate gradient methods based on modified secant equations
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
- A descent family of Dai-Liao conjugate gradient methods
- Some descent three-term conjugate gradient methods and their global convergence
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition
- A spectral three-term Hestenes-Stiefel conjugate gradient method
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- A conjugate gradient method with sufficient descent property
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- Two descent Dai-Yuan conjugate gradient methods for systems of monotone nonlinear equations
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A modified spectral conjugate gradient method with global convergence
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- A modified two-point stepsize gradient algorithm for unconstrained minimization
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter
Cites Work
- New quasi-Newton equation and related methods for unconstrained optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Testing Unconstrained Optimization Software
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- Convergence Properties of the BFGS Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations