An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
From MaRDI portal
Publication: 457047
DOI: 10.1007/s11075-013-9718-7
zbMath: 1301.65041
OpenAlex: W1979539583
MaRDI QID: Q457047
Publication date: 26 September 2014
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-013-9718-7
Keywords: algorithm, unconstrained optimization, global convergence, convex function, conjugacy condition, numerical comparisons, large scale, descent condition, three-term conjugate gradient
Numerical mathematical programming methods (65K05) Convex programming (90C25) Large-scale problems in mathematical programming (90C06) Nonlinear programming (90C30)
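For context on the subject matter: the record concerns a three-term conjugate gradient method satisfying a descent condition. Below is a minimal runnable sketch of a *generic* descent three-term CG iteration (a Zhang-Zhou-Li-style update with Armijo backtracking, in the spirit of the cited "Some descent three-term conjugate gradient methods and their global convergence"). It is not the accelerated subspace minimization scheme of the paper itself; the function `three_term_cg` and the test problem are illustrative assumptions.

```python
# Generic three-term CG sketch (illustrative, not the paper's method):
#   d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k
# with PRP-type beta_k and theta_k chosen so that the descent
# condition d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 holds exactly.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial steepest-descent direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        # Armijo backtracking line search along the descent direction d.
        t, fx, gd = 1.0, f(x), dot(g, d)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * gd:
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]   # gradient difference
        gg = dot(g, g)
        beta = dot(g_new, y) / gg    # PRP-type conjugacy coefficient
        theta = dot(g_new, d) / gg   # third-term coefficient
        d = [-gn + beta * di - theta * yi
             for gn, di, yi in zip(g_new, d, y)]
        x, g = x_new, g_new
    return x

# Illustrative test problem: a strongly convex quadratic.
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]
```

The third term `-theta_k * y_k` is what distinguishes three-term methods from classical two-term CG: it cancels the `beta_k * g_{k+1}^T d_k` contribution, so the search direction is a sufficient descent direction regardless of the line search.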
Related Items
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Continuous and discrete Zhang dynamics for real-time varying nonlinear optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method for unconstrained minimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Design and analysis of two discrete-time ZD algorithms for time-varying nonlinear minimization
- Some three-term conjugate gradient methods with the inexact line search condition
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- Comments on "Another hybrid conjugate gradient algorithm for unconstrained optimization" by Andrei
Uses Software
Cites Work
- Some numerical experiments with variable-storage quasi-Newton algorithms
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- A conjugate direction algorithm without line searches
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
- A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Updating Quasi-Newton Matrices with Limited Storage
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- A Two-Term PRP-Based Descent Method
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles
- Accelerated gradient descent methods with line search