Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
From MaRDI portal
Publication: Q2290291
DOI: 10.1016/j.apm.2017.02.008 · zbMath: 1446.65031 · OpenAlex: W2587634790 · MaRDI QID: Q2290291
Gong Lin Yuan, Xi-wen Lu, Zeng-xin Wei
Publication date: 27 January 2020
Published in: Applied Mathematical Modelling
Full work available at URL: https://doi.org/10.1016/j.apm.2017.02.008
Numerical mathematical programming methods (65K05) · Nonconvex programming, global optimization (90C26)
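The record itself carries no algorithmic detail, so as orientation for the title's subject a minimal sketch of the *standard* weak Wolfe-Powell conditions (the conditions the paper modifies) follows. The bisection-style search and the parameter choices `delta = 1e-4`, `sigma = 0.9` are illustrative assumptions, not the authors' modified rule.

```python
import numpy as np

def weak_wolfe_search(f, grad, x, d, delta=1e-4, sigma=0.9, max_iter=50):
    """Bisection-style search for a step a satisfying the standard
    weak Wolfe-Powell conditions (0 < delta < sigma < 1):
      f(x + a d) <= f(x) + delta * a * g'd   (sufficient decrease)
      g(x + a d)' d >= sigma * g'd           (curvature)
    Illustrative sketch only; not the paper's modified line search."""
    f0, g0d = f(x), grad(x) @ d
    lo, hi, a = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + a * d) > f0 + delta * a * g0d:
            hi = a                      # decrease condition fails: shrink
        elif grad(x + a * d) @ d < sigma * g0d:
            lo = a                      # curvature condition fails: grow
        else:
            return a                    # both conditions hold
        a = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return a
```

On a strictly convex quadratic with `d` the steepest-descent direction, the unit step typically satisfies both conditions and is returned immediately.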
Related Items (56)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
- Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A new adaptive trust region algorithm for optimization problems
- A modified Dai-Liao conjugate gradient method for solving unconstrained optimization and image restoration problems
- New hybrid conjugate gradient method as a convex combination of LS and FR methods
- A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- Global convergence of a modified Broyden family method for nonconvex functions
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- A hybrid conjugate gradient algorithm for nonconvex functions and its applications in image restoration problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- A modified secant equation quasi-Newton method for unconstrained optimization
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- An improved three-term derivative-free method for solving nonlinear equations
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- A three-term conjugate gradient algorithm with quadratic convergence for unconstrained optimization problems
- A new modified BFGS method for unconstrained optimization problems
- A modified three-term PRP conjugate gradient algorithm for optimization models
- A modified Dai-Liao conjugate gradient method with a new parameter for solving image restoration problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- A modified three-term type CD conjugate gradient algorithm for unconstrained optimization problems
- Semi-parametric estimation of multivariate extreme expectiles
- Spectral modified Polak-Ribiére-Polyak projection conjugate gradient method for solving monotone systems of nonlinear equations
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- A conjugate gradient algorithm and its applications in image restoration
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- A tensor trust-region model for nonlinear system
- The global proof of the Polak-Ribière-Polak algorithm under the YWL inexact line search technique
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- A limited memory BFGS subspace algorithm for bound constrained nonsmooth problems
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- A globally convergent BFGS method for symmetric nonlinear equations
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems
- A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A class of line search-type methods for nonsmooth convex regularized minimization
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
- A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
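The two method families named in the title can be summarized by their textbook forms. The sketch below shows the standard BFGS Hessian-approximation update and the standard Polak-Ribière-Polyak (PRP) search direction; these are the classical formulas, not the modified variants the paper analyzes.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B, given the
    step s = x_{k+1} - x_k and gradient difference y = g_{k+1} - g_k:
      B+ = B - (B s s' B)/(s' B s) + (y y')/(y' s).
    The updated matrix satisfies the secant equation B+ s = y."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def prp_direction(g_new, g_old, d_old):
    """Standard Polak-Ribiere-Polyak conjugate gradient direction:
      beta = g_{k+1}'(g_{k+1} - g_k) / ||g_k||^2,
      d_{k+1} = -g_{k+1} + beta * d_k."""
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)
    return -g_new + beta * d_old
```

The secant property `B+ s = y` follows directly from the update, since the first correction cancels `B s` and the second contributes exactly `y`.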
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- Convergence analysis of a modified BFGS method on convex minimizations
- A conjugate gradient method with descent direction for unconstrained optimization
- The convergence properties of some new conjugate gradient methods
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- A modified PRP conjugate gradient method
- An efficient line search for nonlinear least squares
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- New quasi-Newton equation and related methods for unconstrained optimization
- An SQP-type method and its application in stochastic programs
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Local convergence analysis for partitioned quasi-Newton updates
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Convergence Properties of Algorithms for Nonlinear Optimization
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Algorithm 851
- Descent Property and Global Convergence of the Fletcher—Reeves Method with Inexact Line Search
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Variable Metric Method for Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Quasi-Newton Methods, Motivation and Theory
- CUTE
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algoritm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- CUTEr and SifDec
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- On the Convergence of the Variable Metric Algorithm
- Conditioning of Quasi-Newton Methods for Function Minimization
- A New Algorithm for Unconstrained Optimization
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.