An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
DOI: 10.1051/ro/2017057
zbMath: 1461.65182
OpenAlex: W2739392069
MaRDI QID: Q5242267
Saman Babaie-Kafaki, Saeed Rezaee
Publication date: 6 November 2019
Published in: RAIRO - Operations Research
Full work available at URL: https://doi.org/10.1051/ro/2017057
Keywords: unconstrained optimization; global convergence; superlinear convergence; trust region method; adaptive radius; secant equation
MSC classification: Numerical mathematical programming methods (65K05); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
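The title and keywords describe a trust-region method whose quadratic subproblems use a scalar (multiple-of-identity) approximation of the Hessian, combined with a nonmonotone acceptance test and an adaptive radius. The sketch below is a minimal generic illustration of such an iteration, assuming a Barzilai-Borwein-type secant scalar for the model Hessian and a max-based nonmonotone test; the function name `nonmonotone_trust_region` and all parameter choices are illustrative assumptions and do not reproduce the authors' specific update formulas.

```python
import numpy as np

def nonmonotone_trust_region(f, grad, x0, delta0=1.0, max_iter=200,
                             tol=1e-6, memory=5, eta=0.1):
    """Generic nonmonotone trust-region iteration with a scalar Hessian model.

    The model Hessian is B_k = gamma_k * I, with gamma_k taken from a
    Barzilai-Borwein-type secant quantity (an illustrative choice, not the
    paper's specific formula).
    """
    x = np.asarray(x0, dtype=float)
    delta = delta0
    gamma = 1.0
    f_hist = [f(x)]                      # recent f-values for the nonmonotone test
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Subproblem: min g^T d + 0.5*gamma*||d||^2  s.t. ||d|| <= delta.
        # With a scalar positive-definite model, the minimizer is a
        # (possibly truncated) steepest-descent step.
        d = -g / max(gamma, 1e-12)
        if np.linalg.norm(d) > delta:
            d = -delta * g / np.linalg.norm(g)
        pred = -(g @ d + 0.5 * gamma * (d @ d))      # predicted reduction (> 0)
        f_ref = max(f_hist[-memory:])                # nonmonotone reference value
        f_new = f(x + d)
        rho = (f_ref - f_new) / max(pred, 1e-16)     # nonmonotone reduction ratio
        if rho >= eta:                               # accept the step
            s, x = d, x + d
            g_new = grad(x)
            y = g_new - g
            # Secant-based scalar update (Barzilai-Borwein-type), kept positive.
            if s @ y > 0:
                gamma = (s @ y) / (s @ s)
            g = g_new
            f_hist.append(f_new)
            delta = max(delta, 2.0 * np.linalg.norm(d))   # expand the radius
        else:
            delta *= 0.5                                  # shrink and retry
    return x

# Example use: minimize the Rosenbrock function (illustrative only).
rosen = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
rosen_g = lambda x: np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                              200 * (x[1] - x[0]**2)])
print(nonmonotone_trust_region(rosen, rosen_g, [-1.2, 1.0]))
```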
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- A nonmonotone trust region method with adaptive radius for unconstrained optimization problems
- A self-adaptive trust region method with line search based on a simple subproblem model
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Accelerating the cubic regularization of Newton's method on convex problems
- A nonmonotone trust region method based on simple quadratic models
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Cubic regularization of Newton method and its global performance
- Optimization theory and methods. Nonlinear programming
- An adaptive trust region method and its convergence
- A new nonmonotone adaptive trust region method based on simple quadratic models
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- An adaptive nonmonotone trust region method for unconstrained optimization problems based on a simple subproblem
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- The Quasi-Cauchy Relation and Diagonal Updating
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- CUTEr and SifDec
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles.