An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
From MaRDI portal
Recommendations
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- A New Adaptive Subspace Minimization Three-Term Conjugate Gradient Algorithm for Unconstrained Optimization
- A three-term conjugate gradient algorithm using subspace for large-scale unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
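For context, the works listed here concern three-term conjugate gradient (CG) methods, which extend the classical CG direction with a third correction term, d_{k+1} = -g_{k+1} + β_k d_k - θ_k y_k. The sketch below is a minimal illustration of that general family, using hypothetical textbook-style Hestenes-Stiefel choices for β_k and θ_k and an Armijo backtracking line search; it is not the accelerated subspace minimization scheme of the publication itself.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Illustrative three-term CG: d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k.

    beta is Hestenes-Stiefel-style, theta = g_{k+1}^T d_k / (d_k^T y_k),
    which makes g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 (sufficient descent).
    These are generic textbook choices, not the paper's method.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:            # degenerate curvature: restart
            d_new = -g_new
        else:
            beta = (g_new @ y) / denom    # Hestenes-Stiefel coefficient
            theta = (g_new @ d) / denom   # third-term weight
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x

# Usage: strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```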
Cites work
- scientific article; zbMATH DE number 992790
- scientific article; zbMATH DE number 992793
- scientific article; zbMATH DE number 3843083
- scientific article; zbMATH DE number 3526471
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 3439906
- scientific article; zbMATH DE number 3278849
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Two-Term PRP-Based Descent Method
- A conjugate direction algorithm without line searches
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization
- A note on the convergence properties of the original three-term Hestenes-Stiefel method
- A survey of nonlinear conjugate gradient methods
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
- Accelerated gradient descent methods with line search
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Efficient generalized conjugate gradient algorithms. I: Theory
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- Some descent three-term conjugate gradient methods and their global convergence
- Some numerical experiments with variable-storage quasi-Newton algorithms
- The conjugate gradient method in extremal problems
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (27)
- A class of accelerated subspace minimization conjugate gradient methods
- A New Adaptive Subspace Minimization Three-Term Conjugate Gradient Algorithm for Unconstrained Optimization
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A new subspace minimization conjugate gradient method for unconstrained minimization
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- Some three-term conjugate gradient methods with the inexact line search condition
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- An improved three-term conjugate gradient algorithm for solving unconstrained optimization problems
- An improved three-dimensional subspace minimization conjugate gradient method
- Continuous and discrete Zhang dynamics for real-time varying nonlinear optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- A three-term conjugate gradient algorithm using subspace for large-scale unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- On three-term conjugate gradient algorithms for unconstrained optimization
- scientific article; zbMATH DE number 641552
- Comments on "Another hybrid conjugate gradient algorithm for unconstrained optimization" by Andrei
- A three-term conjugate gradient method with accelerated subspace quadratic optimization
- Design and analysis of two discrete-time ZD algorithms for time-varying nonlinear minimization