A Dynamic Subspace Based BFGS Method for Large Scale Optimization Problem
From MaRDI portal
Publication:6333203
arXiv: 2001.07335
MaRDI QID: Q6333203
FDO: Q6333203
Authors: Zheng Li, Shi Shu, Jianping Zhang
Publication date: 20 January 2020
Abstract: Large-scale unconstrained optimization is a fundamental and important, yet not well-solved, class of problems in numerical optimization. The main challenge in designing an algorithm is to require few storage locations or very inexpensive computations while preserving global convergence. In this work, we propose a novel approach to solving large-scale unconstrained optimization problems by combining a dynamic subspace technique with the BFGS update. We demonstrate that our approach achieves the same rate of convergence in the dynamic subspace as BFGS while requiring less memory than L-BFGS. Further, we give a convergence analysis by constructing a mapping from a low-dimensional Euclidean space to the adaptive subspace. We compare our hybrid algorithm with the BFGS and L-BFGS approaches. Experimental results show that our hybrid algorithm offers several significant advantages, including suitability for parallel computing, convergence efficiency, and robustness.
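The core idea in the abstract (running the BFGS update on a small, adaptively chosen subspace so that only a k x k inverse-Hessian approximation is stored, rather than the full n x n one) can be sketched as follows. This is a minimal illustration, not the authors' F-BFGS implementation: the subspace construction here (current gradient plus random directions, refreshed each outer epoch) and all function names are assumptions for the sake of a runnable example.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS inverse-Hessian update, applied here to the small
    k x k subspace matrix rather than the full n x n one."""
    rho = 1.0 / float(y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)

def subspace_bfgs(f, grad_f, x0, k=5, epochs=20, inner=10, seed=0):
    """Toy dynamic-subspace BFGS: each epoch spans a fresh k-dimensional
    subspace (current gradient plus random directions -- an assumption,
    not the paper's construction) and runs BFGS on the reduced problem."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(epochs):
        g = grad_f(x)
        if np.linalg.norm(g) < 1e-10:
            break
        basis = np.column_stack([g, rng.standard_normal((len(x), k - 1))])
        V, _ = np.linalg.qr(basis)      # orthonormal n x k subspace basis
        H = np.eye(V.shape[1])          # subspace inverse-Hessian guess
        z = np.zeros(V.shape[1])        # iterate coordinates within span(V)
        g_sub = V.T @ g
        for _ in range(inner):
            p = -H @ g_sub              # quasi-Newton direction in R^k
            t, fz = 1.0, f(x + V @ z)
            # Armijo backtracking line search along V @ p
            while f(x + V @ (z + t * p)) > fz + 1e-4 * t * (g_sub @ p):
                t *= 0.5
                if t < 1e-12:
                    break
            s = t * p
            z_new = z + s
            g_new = V.T @ grad_f(x + V @ z_new)
            y = g_new - g_sub
            if y @ s > 1e-12:           # curvature condition keeps H positive definite
                H = bfgs_update(H, s, y)
            z, g_sub = z_new, g_new
        x = x + V @ z
    return x
```

Note that only the k x k matrix `H` and the n x k basis `V` are stored, which is where the memory saving relative to full BFGS comes from; how the paper's dynamic subspace is actually constructed and updated is specified in the article itself.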
Has companion code repository: https://github.com/LizhengMathAi/F-BFGS