A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
From MaRDI portal
Publication: 2322338
DOI: 10.1007/S40314-019-0779-7
zbMath: 1438.90329
OpenAlex: W2898829052
Wikidata: Q128257116
Scholia: Q128257116
MaRDI QID: Q2322338
Publication date: 4 September 2019
Published in: Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1007/s40314-019-0779-7
- Numerical mathematical programming methods (65K05)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
Related Items (5)
- A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
Uses Software
Cites Work
- Some modified conjugate gradient methods for unconstrained optimization
- A Barzilai-Borwein conjugate gradient method
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- Interpolation by conic model for unconstrained optimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Deriving collinear scaling algorithms as extensions of quasi-Newton methods and the local convergence of DFP- and BFGS-related collinear scaling algorithms
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Conic Approximations and Collinear Scalings for Optimizers
- The Q-Superlinear Convergence of a Collinear Scaling Algorithm for Unconstrained Optimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles
- A conic trust-region method for nonlinearly constrained optimization
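The cited works trace back to the classical linear conjugate gradient method of Hestenes and Stiefel ("Methods of conjugate gradients for solving linear systems"). As background for the subspace minimization CG methods this page indexes, here is a minimal sketch of that classical method for a symmetric positive definite system; this is illustrative only and is not the conic-model method of the publication itself:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Classical (linear) conjugate gradient: solve A x = b for SPD A.

    Illustrative sketch of the Hestenes-Stiefel method, not the
    conic-model subspace minimization variant of the indexed paper.
    """
    x = np.zeros_like(b)
    r = b - A @ x              # initial residual
    p = r.copy()               # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:       # converged
            break
        p = r + (rs_new / rs_old) * p   # update conjugate direction
        rs_old = rs_new
    return x
```

In exact arithmetic the method terminates in at most n iterations for an n-by-n system, which is why it underpins so many of the nonlinear CG extensions cited above.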