Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
Publication: 6174641
DOI: 10.1080/02331934.2022.2048381
OpenAlex: W4224046108
MaRDI QID: Q6174641
FDO: Q6174641
Authors: Razieh Dehghani, Author name not available, M. J. Ebadi, Author name not available
Publication date: 14 July 2023
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2022.2048381
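For orientation, the sketch below illustrates the general idea behind BFGS methods built on modified secant relations, a theme running through several of the cited works (e.g. the Zhang-Deng-Chen-type modified quasi-Newton equation): the standard BFGS update is applied with a corrected difference vector y_tilde_k = y_k + (theta_k / (s_k^T s_k)) s_k, theta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k, so that the update satisfies B_{k+1} s_k = y_tilde_k. This is a generic, hedged illustration of that technique, not necessarily the specific secant relation proposed in this publication; all function names below are hypothetical.

```python
# Minimal sketch (assumption: generic modified-secant BFGS, not this paper's method).
import numpy as np

def modified_bfgs_update(B, s, y_tilde):
    """One BFGS update of the Hessian approximation B using a (possibly
    modified) secant vector y_tilde, so that B_new @ s == y_tilde."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y_tilde
    # Skip the update if the curvature condition s^T y_tilde > 0 (nearly) fails,
    # which preserves positive definiteness of B.
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y_tilde):
        return B
    return B - np.outer(Bs, Bs) / sBs + np.outer(y_tilde, y_tilde) / sy

def modified_secant_vector(s, y, f_old, f_new, g_old, g_new):
    """Zhang-Deng-Chen-type modified secant vector (with u_k = s_k):
    y_tilde = y + (theta / s^T s) s, theta = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s."""
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s

if __name__ == "__main__":
    # Tiny usage example on a convex quadratic f(x) = 0.5 x^T A x;
    # note theta vanishes on quadratics, so y_tilde reduces to the usual y.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    f = lambda x: 0.5 * x @ A @ x
    g = lambda x: A @ x
    x_old, x_new = np.array([1.0, 1.0]), np.array([0.4, 0.7])
    s, y = x_new - x_old, g(x_new) - g(x_old)
    y_t = modified_secant_vector(s, y, f(x_old), f(x_new), g(x_old), g(x_new))
    B_new = modified_bfgs_update(np.eye(2), s, y_t)
    print(B_new @ s, y_t)  # the modified secant relation B_new s = y_tilde holds
```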
Cites Work
- Title not available
- Title not available
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Benchmarking optimization software with performance profiles
- A Family of Variable-Metric Methods Derived by Variational Means
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- Optimization theory and methods. Nonlinear programming
- Quasi-Newton Methods, Motivation and Theory
- New quasi-Newton methods for unconstrained optimization problems
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- New quasi-Newton equation and related methods for unconstrained optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Convergence Properties of the BFGS Algorithm
- Convergence analysis of a modified BFGS method on convex minimizations
- Two new conjugate gradient methods based on modified secant equations
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- New quasi-Newton methods via higher order tensor models
- Multi-step quasi-Newton methods for optimization
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- The global convergence of a modified BFGS method for nonconvex functions
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A nonlinear model for function-value multistep methods
Cited In (2)