Convergence analysis of a modified BFGS method on convex minimizations
DOI: 10.1007/s10589-008-9219-0
zbMath: 1228.90077
MaRDI QID: Q711385
Publication date: 26 October 2010
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-008-9219-0
MSC classification: 90C25 Convex programming
Related Items
- A BFGS algorithm for solving symmetric nonlinear equations
- A Conjugate Gradient Method Based on a Modified Secant Relation for Unconstrained Optimization
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A quasi-Newton algorithm for large-scale nonlinear equations
- An active set limited memory BFGS algorithm for bound constrained optimization
- A conjugate gradient method with descent direction for unconstrained optimization
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- A new adaptive trust region algorithm for optimization problems
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A double parameter scaled BFGS method for unconstrained optimization
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- A new modified BFGS method for unconstrained optimization problems
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- The projection technique for two open problems of unconstrained optimization problems
- Two-step conjugate gradient method for unconstrained optimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A Modified Hestenes-Stiefel Conjugate Gradient Algorithm for Large-Scale Optimization
- A Modified Nonmonotone Hestenes–Stiefel Type Conjugate Gradient Methods for Large-Scale Unconstrained Problems
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- A Trust Region Algorithm with Conjugate Gradient Technique for Optimization Problems
Uses Software
Cites Work
- Differential optimization techniques
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- A note on minimization problems and multistep methods
- An SQP-type method and its application in stochastic programs
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Local convergence analysis for partitioned quasi-Newton updates
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- New quasi-Newton methods for unconstrained optimization problems
- New line search methods for unconstrained optimization
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- The “global” convergence of Broyden-like methods with suitable line search
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Testing Unconstrained Optimization Software
- Quasi-Newton Methods, Motivation and Theory
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Global convergence of the BFGS algorithm with nonmonotone line search
- One-step and multistep procedures for constrained minimization problems
- CUTEr and SifDec
- Variable metric methods of minimisation
- On the Convergence of the Variable Metric Algorithm
- A New Algorithm for Unconstrained Optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.