Convergence analysis of a modified BFGS method on convex minimizations
From MaRDI portal
Publication: 711385
DOI: 10.1007/s10589-008-9219-0
zbMath: 1228.90077
OpenAlex: W2000886126
MaRDI QID: Q711385
Publication date: 26 October 2010
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-008-9219-0
Related Items (52)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A new adaptive trust region algorithm for optimization problems
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- A hybrid quasi-Newton method with application in sparse recovery
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- Global convergence of a modified Broyden family method for nonconvex functions
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- A double parameter scaled BFGS method for unconstrained optimization
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- Two-step conjugate gradient method for unconstrained optimization
- An active set limited memory BFGS algorithm for bound constrained optimization
- A modified secant equation quasi-Newton method for unconstrained optimization
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems
- A survey of gradient methods for solving nonlinear optimization
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- A quasi-Newton algorithm for large-scale nonlinear equations
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A new modified BFGS method for unconstrained optimization problems
- A BFGS algorithm for solving symmetric nonlinear equations
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- A conjugate gradient method with descent direction for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
- A tensor trust-region model for nonlinear system
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
Uses Software
Cites Work
- Differential optimization techniques
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- A note on minimization problems and multistep methods
- An SQP-type method and its application in stochastic programs
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Local convergence analysis for partitioned quasi-Newton updates
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- New quasi-Newton methods for unconstrained optimization problems
- New line search methods for unconstrained optimization
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- The “global” convergence of Broyden-like methods with suitable line search
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Quasi-Newton Methods, Motivation and Theory
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Global convergence of the BFGS algorithm with nonmonotone line search
- One-step and multistep procedures for constrained minimization problems
- CUTEr and SifDec
- Variable metric methods of minimisation
- On the Convergence of the Variable Metric Algorithm
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
This page was built for publication: Convergence analysis of a modified BFGS method on convex minimizations