A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
From MaRDI portal
Publication: 2204182
DOI: 10.1007/s40314-020-01301-9
zbMath: 1463.90206
OpenAlex: W3051292649
MaRDI QID: Q2204182
Publication date: 15 October 2020
Published in: Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1007/s40314-020-01301-9
global convergence; conjugate gradient method; subspace minimization; modified secant equation; modified nonmonotone Wolfe line search
Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
Uses Software
Cites Work
- A Barzilai-Borwein conjugate gradient method
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- A Riemannian subspace limited-memory SR1 trust region method
- New quasi-Newton methods via higher order tensor models
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- New quasi-Newton equation and related methods for unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- An adaptive conjugacy condition and related nonlinear conjugate gradient methods
- A Modified BFGS Algorithm for Unconstrained Optimization
- A Subspace Minimization Method for the Trust-Region Step
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Subspace Trust‐Region Methods for Large Bound‐Constrained Nonlinear Equations
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
- A new efficient conjugate gradient method for unconstrained optimization