Adaptive scaling damped BFGS method without gradient Lipschitz continuity
MaRDI publication: Q2060902
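For orientation, the "damped BFGS" idea this publication builds on can be illustrated with Powell's classical damped update, which keeps the Hessian approximation positive definite even when the curvature condition fails (as it may for nonconvex functions without a Lipschitz-continuous gradient). The sketch below is a generic illustration of that standard technique, not the adaptive scaling scheme of the paper itself; the function name, signature, and the damping constant `eta` are illustrative assumptions.

```python
import numpy as np

def damped_bfgs_update(B, s, y, eta=0.2):
    """Powell-damped BFGS update of the Hessian approximation B.

    Generic sketch (not the paper's adaptive scaling variant):
    y is replaced by r = theta*y + (1 - theta)*B s, with theta chosen
    so that s^T r = eta * s^T B s > 0, preserving positive definiteness
    of B even when the curvature condition s^T y > 0 fails.
    """
    Bs = B @ s
    sBs = s @ Bs          # curvature predicted by the current model
    sy = s @ y            # actual curvature along the step
    # Powell's damping: blend y toward B s when curvature is too weak.
    if sy >= eta * sBs:
        theta = 1.0       # standard BFGS update is safe
    else:
        theta = (1.0 - eta) * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs
    # Rank-two BFGS update with the damped difference vector r.
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
```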
Recommendations
- An adaptive scaled BFGS method for unconstrained optimization
- An adaptive sizing BFGS method for unconstrained optimization
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- Scientific article, zbMATH DE number 7267270 (title unavailable)
- A modified scaling BFGS method for nonconvex minimization
- An adaptive Riemannian gradient method without function evaluations
- The adaptive \(s\)-step conjugate gradient method
- An adaptive gradient algorithm for large-scale nonlinear bound constrained optimization
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Adaptive strategy for the damping parameters in an iteratively regularized Gauss-Newton method
Cites work
- Scientific article, zbMATH DE number 88930 (title unavailable)
- Scientific article, zbMATH DE number 778130 (title unavailable)
- Scientific article, zbMATH DE number 3417387 (title unavailable)
- A Family of Variable-Metric Methods Derived by Variational Means
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- A new approach to variable metric algorithms
- An adaptive scaled BFGS method for unconstrained optimization
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- CUTE: constrained and unconstrained testing environment
- Conditioning of Quasi-Newton Methods for Function Minimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Quasi-Newton Methods, Motivation and Theory
- Testing Unconstrained Optimization Software
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The global convergence of a modified BFGS method for nonconvex functions
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
Cited in (14 documents)
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- A conjugate gradient algorithm without Lipschitz continuity and its applications
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- Spectral scaling BFGS method
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- Global convergence of a cautious projection BFGS algorithm for nonconvex problems without gradient Lipschitz continuity
- Solving nonlinear equations with a direct Broyden method and its acceleration
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- A structured L-BFGS method and its application to inverse problems