Convergence analysis of a modified BFGS method on convex minimizations
From MaRDI portal
Recommendations
- A modified BFGS method and its global convergence in nonconvex minimization
- Analysis of a quasi-Newton method for unconstrained optimization
- scientific article; zbMATH DE number 700611
- Global convergence of a modified limited memory BFGS method for non-convex minimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
Cites work
- scientific article; zbMATH DE number 991654
- scientific article; zbMATH DE number 3928227
- scientific article; zbMATH DE number 3529352
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 1131990
- scientific article; zbMATH DE number 3453051
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A New Algorithm for Unconstrained Optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A modified BFGS method and its global convergence in nonconvex minimization
- A new line search method with trust region for unconstrained optimization
- A note on minimization problems and multistep methods
- An SQP-type method and its application in stochastic programs
- Benchmarking optimization software with performance profiles.
- CUTEr and SifDec
- Convergence Properties of the BFGS Algorithm
- Differential optimization techniques
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global convergence of the BFGS algorithm with nonmonotone line search
- Global convergence analysis of a new nonmonotone BFGS algorithm on convex objective functions
- Global convergence of the partitioned BFGS algorithm for convex partially separable optimization
- Local convergence analysis for partitioned quasi-Newton updates
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New line search methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- On the Convergence of the Variable Metric Algorithm
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- One-step and multistep procedures for constrained minimization problems
- Quasi-Newton Methods, Motivation and Theory
- Testing Unconstrained Optimization Software
- The BFGS method with exact line searches fails for non-convex objective functions
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- The “global” convergence of Broyden-like methods with suitable line search
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- Variable metric methods of minimisation
Cited in (67)
- A modified HS-DY-type method with nonmonotone line search for image restoration and unconstrained optimization problems
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- A conjugate gradient algorithm without Lipschitz continuity and its applications
- scientific article; zbMATH DE number 2092232
- Using a modified secant equation for unconstrained optimization
- A modified two-parameter scaled Broyden-type algorithm for unconstrained optimization problems
- scientific article; zbMATH DE number 5723677
- An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems
- scientific article; zbMATH DE number 5583295
- A conjugate gradient method with descent direction for unconstrained optimization
- A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems
- A tensor trust-region model for nonlinear system
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A trust region algorithm with conjugate gradient technique for optimization problems
- The convergence of a new modified BFGS method without line searches for unconstrained optimization or complexity systems
- A modified secant equation quasi-Newton method for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- The superlinear convergence analysis of a nonmonotone BFGS algorithm on convex objective functions
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Analysis of a quasi-Newton method for unconstrained optimization
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule
- Modified three-term Liu-Storey conjugate gradient method for solving unconstrained optimization problems and image restoration problems
- A modified nonmonotone BFGS algorithm for unconstrained optimization
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- A new adaptive trust region algorithm for optimization problems
- Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Two-step conjugate gradient method for unconstrained optimization
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified nonmonotone Hestenes-Stiefel type conjugate gradient method for large-scale unconstrained problems
- scientific article; zbMATH DE number 4123183
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the \(\ell_\infty\) matrix norm
- A hybrid quasi-Newton method with application in sparse recovery
- Convergence of the BFGS Method for $LC^1$ Convex Constrained Optimization
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- On convergence and complexity of the modified forward-backward method involving new linesearches for convex minimization
- Global convergence properties of two modified BFGS-type methods
- Improving the convergence behaviour of BiCGSTAB by applying D-norm minimization
- An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
- A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models
- A BFGS algorithm for solving symmetric nonlinear equations
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- Global convergence of a modified Broyden family method for nonconvex functions
- A double parameter scaled BFGS method for unconstrained optimization
- The projection technique for two open problems of unconstrained optimization problems
- A new modified BFGS method for unconstrained optimization problems
- A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems
- An active set limited memory BFGS algorithm for bound constrained optimization
- Convergence analysis of an improved BFGS method and its application in the Muskingum model
- The global convergence of a modified BFGS method for nonconvex functions
- A quasi-Newton algorithm for large-scale nonlinear equations
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A modified PRP conjugate gradient algorithm with nonmonotone line search for nonsmooth convex optimization problems
- A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
- A conjugate gradient method based on a modified secant relation for unconstrained optimization
- The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems
- A survey of gradient methods for solving nonlinear optimization
- A modified Hestenes-Stiefel conjugate gradient algorithm for large-scale optimization
MaRDI item: Q711385