A Subspace Study on Conjugate Gradient Algorithms

From MaRDI portal


DOI: 10.1002/zamm.19950750118

zbMath: 0823.65061

MaRDI QID: Q4835475

Josef Stoer, Ya-Xiang Yuan

Publication date: 25 June 1995

Published in: ZAMM - Journal of Applied Mathematics and Mechanics / Zeitschrift für Angewandte Mathematik und Mechanik

Full work available at URL: https://doi.org/10.1002/zamm.19950750118


65K05: Numerical mathematical programming methods

90C30: Nonlinear programming

90C52: Methods of reduced gradient type


Related Items

A family of quasi-Newton methods for unconstrained optimization problems

A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization

Subspace Methods in Multi-Parameter Seismic Full Waveform Inversion

An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization

A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model

A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization

A new subspace minimization conjugate gradient method for unconstrained minimization

An overview of nonlinear optimization

New hybrid conjugate gradient method for unconstrained optimization

A three-term derivative-free projection method for nonlinear monotone system of equations

A Barzilai-Borwein conjugate gradient method

Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization

An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization

A Riemannian subspace limited-memory SR1 trust region method

Subspace methods for large scale nonlinear equations and nonlinear least squares

A globally convergent Newton-GMRES method for large sparse systems of nonlinear equations

A subspace conjugate gradient algorithm for large-scale unconstrained optimization

An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization

Acceleration of sequential subspace optimization in Banach spaces by orthogonal search directions

New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization

Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems

A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization

New hybrid conjugate gradient method as a convex combination of LS and FR methods

A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization

A class of accelerated subspace minimization conjugate gradient methods

An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition

A subspace minimization conjugate gradient method based on conic model for unconstrained optimization

A new conjugate gradient method with an efficient memory structure

A subspace SQP method for equality constrained optimization

On efficiently combining limited-memory and trust-region techniques

A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization

An optimal subgradient algorithm with subspace search for costly convex optimization problems

A subspace implementation of quasi-Newton trust region methods for unconstrained optimization
