A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization

From MaRDI portal
Publication:6141538

DOI: 10.1007/S11075-023-01559-0
arXiv: 2301.02863
MaRDI QID: Q6141538
FDO: Q6141538

Author name not available

Publication date: 19 December 2023

Published in: Numerical Algorithms

Abstract: In this paper, based on limited memory techniques and subspace minimization conjugate gradient (SMCG) methods, a regularized limited memory subspace minimization conjugate gradient method is proposed, which contains two types of iterations. In SMCG iterations, we obtain the search direction by minimizing an approximate quadratic model or an approximate regularization model. In RQN iterations, combining the regularization technique with the BFGS method, a modified regularized quasi-Newton method is used in the subspace to improve the orthogonality. Moreover, some simple acceleration criteria and an improved strategy for selecting the initial stepsize are designed to enhance the efficiency of the algorithm. Additionally, a generalized nonmonotone line search is utilized, and the global convergence of our proposed algorithm is established under mild conditions. Finally, numerical results show that the proposed algorithm improves significantly over ASMCG_PR and is superior to the particularly well-known limited memory conjugate gradient software packages CG_DESCENT (6.8) and CGOPT (2.0) on the CUTEr library.
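To make the SMCG iteration concrete, the following is a minimal sketch of the generic subspace minimization CG idea on a strictly convex quadratic: the search direction is obtained by minimizing the quadratic model over the two-dimensional subspace spanned by the negative gradient and the previous step. This illustrates only the basic mechanism; the paper's algorithm additionally uses limited memory information, a regularization model, RQN iterations, and a nonmonotone line search, none of which are shown here. All function and variable names are illustrative.

```python
import numpy as np

def smcg_quadratic(A, b, x0, tol=1e-10, max_iter=200):
    """Subspace minimization CG sketch for f(x) = 0.5 x^T A x - b^T x."""
    x = x0.astype(float)
    g = A @ x - b
    s_prev = None  # previous step s_{k-1}
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if s_prev is None:
            d = -g  # first iteration: steepest descent
        else:
            # Minimize the quadratic model over span{-g, s_prev}:
            # with d = B t, solve the 2x2 system (B^T A B) t = -B^T g.
            B = np.column_stack([-g, s_prev])
            H = B.T @ A @ B
            t = np.linalg.solve(H, -B.T @ g)
            d = B @ t
        # Exact line search along d (for a subspace-minimizing d this gives 1).
        alpha = -(g @ d) / (d @ A @ d)
        s_prev = alpha * d
        x = x + s_prev
        g = A @ x - b
    return x

# Example: random SPD system; the minimizer satisfies A x* = b.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x_star = smcg_quadratic(A, b, np.zeros(20))
print(np.linalg.norm(A @ x_star - b))  # small residual at convergence
```

On a quadratic with exact line searches, this subspace step reproduces classical conjugate gradient behavior; the appeal of SMCG methods for general nonlinear objectives is that the subspace problem remains cheap (a small linear system) while exploiting curvature information along recent directions.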


Full work available at URL: https://arxiv.org/abs/2301.02863







Cited In (3)





