Adaptive scaling damped BFGS method without gradient Lipschitz continuity
From MaRDI portal
Publication:2060902
DOI: 10.1016/J.AML.2021.107634 · zbMATH Open: 1482.90175 · OpenAlex: W3197176902 · MaRDI QID: Q2060902
Authors: Mengxiang Zhang, Yingjie Zhou, Gonglin Yuan
Publication date: 13 December 2021
Published in: Applied Mathematics Letters
Full work available at URL: https://doi.org/10.1016/j.aml.2021.107634
Recommendations
- An adaptive scaled BFGS method for unconstrained optimization
- An adaptive sizing BFGS method for unconstrained optimization
- A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update
- scientific article; zbMATH DE number 7267270
- A modified scaling BFGS method for nonconvex minimization
- An adaptive Riemannian gradient method without function evaluations
- The adaptive \(s\)-step conjugate gradient method
- An adaptive gradient algorithm for large-scale nonlinear bound constrained optimization
- Damped techniques for the limited memory BFGS method for large-scale optimization
- Adaptive strategy for the damping parameters in an iteratively regularized Gauss-Newton method
Cites Work
- Testing Unconstrained Optimization Software
- CUTE
- Benchmarking optimization software with performance profiles
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- An unconstrained optimization test functions collection
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Quasi-Newton Methods, Motivation and Theory
- Title not available
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- A modified BFGS method and its global convergence in nonconvex minimization
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Title not available
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- An adaptive scaled BFGS method for unconstrained optimization
- The global convergence of a modified BFGS method for nonconvex functions
- The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions
- Title not available
Cited In (14)
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- Solving nonlinear equations with a direct Broyden method and its acceleration
- Another Hager-Zhang-type method via singular-value study for constrained monotone equations with application
- A conjugate gradient algorithm without Lipschitz continuity and its applications
- Spectral scaling BFGS method
- Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
- Global convergence of a cautious projection BFGS algorithm for nonconvex problems without gradient Lipschitz continuity
- Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- An adaptive projection BFGS method for nonconvex unconstrained optimization problems
- Modified globally convergent Polak-Ribière-Polyak conjugate gradient methods with self-correcting property for large-scale unconstrained optimization
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- A structured L-BFGS method and its application to inverse problems