A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
From MaRDI portal
Publication:6141538
Abstract: In this paper, based on limited memory techniques and subspace minimization conjugate gradient (SMCG) methods, a regularized limited memory subspace minimization conjugate gradient method is proposed, which contains two types of iterations. In the SMCG iteration, the search direction is obtained by minimizing an approximate quadratic model or an approximate regularization model. In the RQN iteration, combining the regularization technique with the BFGS method, a modified regularized quasi-Newton method is used in the subspace to improve orthogonality. Moreover, some simple acceleration criteria and an improved strategy for selecting the initial stepsize are designed to enhance the efficiency of the algorithm. Additionally, a generalized nonmonotone line search is utilized, and the global convergence of the proposed algorithm is established under mild conditions. Finally, numerical results show that the proposed algorithm improves significantly over ASMCG_PR and is superior to the well-known limited memory conjugate gradient software packages CG_DESCENT (6.8) and CGOPT (2.0) on the CUTEr library.
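The two ingredients the abstract describes, a search direction obtained by minimizing an approximate quadratic model over a low-dimensional subspace and a nonmonotone line search, can be sketched minimally as follows. This is an illustration, not the paper's algorithm: the two-dimensional subspace span{g, s}, the Barzilai-Borwein scalar surrogate for the Hessian's action on g, and all parameter values are assumptions made here for the sketch.

```python
import numpy as np

def smcg_direction(g, s, y, eps=1e-10):
    """Illustrative SMCG-style direction: minimize the quadratic model
    q(d) = g.d + 0.5 d.B.d over the subspace span{g, s}, where the action
    of B is modeled by the secant pair (B s = y) and a Barzilai-Borwein
    scalar sigma = y.y / s.y on g (an assumption, not the paper's model)."""
    sy = s.dot(y)
    if sy <= eps:                        # curvature safeguard: steepest descent
        return -g
    sigma = y.dot(y) / sy                # BB-style scalar Hessian surrogate
    # Write d = a*g + b*s; stationarity of q in (a, b) gives a 2x2 system.
    M = np.array([[sigma * g.dot(g), g.dot(y)],
                  [g.dot(y),         sy      ]])
    rhs = -np.array([g.dot(g), g.dot(s)])
    if abs(np.linalg.det(M)) <= eps:     # degenerate subspace: fall back
        return -g
    a, b = np.linalg.solve(M, rhs)
    return a * g + b * s

def nonmonotone_armijo(f, x, g, d, f_hist, c1=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking line search against the maximum of recent function values
    (a Grippo-Lampariello-Lucidi style nonmonotone Armijo condition)."""
    f_ref = max(f_hist)                  # reference value over a sliding window
    slope = c1 * g.dot(d)                # requires a descent direction, g.d < 0
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + alpha * slope:
            break
        alpha *= rho
    return alpha
```

On a convex quadratic, for example, the subspace direction is a descent direction whenever the curvature safeguard passes, and the nonmonotone condition reduces to the ordinary Armijo condition when the history window contains only the current function value.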
Recommendations
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- A new regularized quasi-Newton method for unconstrained optimization
Cites work
- Scientific article; zbMATH DE number 3278849 (no title available)
- A Barzilai-Borwein conjugate gradient method
- A Globally and Superlinearly Convergent Gauss--Newton-Based BFGS Method for Symmetric Nonlinear Equations
- A Modified BFGS Algorithm for Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A class of accelerated subspace minimization conjugate gradient methods
- A new nonmonotone line search technique for unconstrained optimization
- A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations
- A regularized limited memory BFGS method for nonconvex unconstrained minimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- Algorithm 851
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- Benchmarking optimization software with performance profiles
- CUTEr and SifDec
- Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
- Descent Directions of Quasi-Newton Methods for Symmetric Nonlinear Equations
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- Modified two-point stepsize gradient methods for unconstrained optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- Numerical Optimization
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- The Limited Memory Conjugate Gradient Method
- The conjugate gradient method in extremal problems
- Two-Point Step Size Gradient Methods
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (3)
- Limited memory restarted \(\ell^p\)-\(\ell^q\) minimization methods using generalized Krylov subspaces
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A subspace derivative-free projection method for convex constrained nonlinear equations
This page was built for publication: A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization