Limited-memory BFGS systems with diagonal updates
From MaRDI portal
Abstract: In this paper, we investigate a formula to solve systems of the form (B + σI)x = y, where B is a limited-memory BFGS quasi-Newton matrix and σ is a positive constant. Systems of this type arise naturally in large-scale optimization, for example in trust-region methods and in doubly-augmented Lagrangian methods. We show that, provided a simple condition holds on B_0 and σ, the system (B + σI)x = y can be solved via a recursion formula that requires only vector inner products. This formula has complexity O(M²n), where M is the number of L-BFGS updates and n >> M is the dimension of x.
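The abstract's O(M²n) claim can be illustrated with a short sketch. The code below is not the paper's recursion formula; it is an assumed alternative route to the same kind of solve, using the compact L-BFGS representation of Byrd, Nocedal, and Schnabel together with the Sherman-Morrison-Woodbury identity, so that only 2M×2M dense solves and inner products against the n-dimensional vectors are needed. The helper names (`lbfgs_shifted_solve`, `dense_bfgs`) are illustrative, not from the paper.

```python
import numpy as np

def lbfgs_shifted_solve(S, Y, gamma, sigma, rhs):
    """Solve (B + sigma*I) x = rhs, where B is the L-BFGS matrix with
    B_0 = gamma*I built from the M update pairs stored as columns of S, Y.

    Sketch only: uses the compact representation B = gamma*I - U K^{-1} U^T
    and Sherman-Morrison-Woodbury, so the large dimension n is touched
    only through inner products -- O(M^2 n) work overall.
    """
    StS = S.T @ S                        # small M x M Gram matrix
    StY = S.T @ Y
    L = np.tril(StY, -1)                 # strictly lower part of S^T Y
    D = np.diag(np.diag(StY))            # diag(s_i^T y_i)
    U = np.hstack([gamma * S, Y])        # n x 2M
    K = np.block([[gamma * StS, L], [L.T, -D]])
    delta = gamma + sigma
    # B + sigma*I = delta*I + U C U^T with C = -K^{-1}; by SMW,
    # x = rhs/delta - U (C^{-1} + U^T U / delta)^{-1} U^T rhs / delta^2.
    small = -K + (U.T @ U) / delta       # 2M x 2M system
    return rhs / delta - U @ np.linalg.solve(small, U.T @ rhs) / delta**2

def dense_bfgs(S, Y, gamma):
    """Reference: apply the M BFGS updates densely (for checking only)."""
    n = S.shape[0]
    B = gamma * np.eye(n)
    for i in range(S.shape[1]):
        s, y = S[:, i], Y[:, i]
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B
```

A quick consistency check: generate pairs with y_i = A s_i for a symmetric positive definite A (so the curvature condition sᵀy > 0 holds), then compare against a dense solve of (B + σI)x = y.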
Recommendations
Cites work
- scientific article; zbMATH DE number 3984475
- scientific article; zbMATH DE number 3436540
- scientific article; zbMATH DE number 852532
- scientific article; zbMATH DE number 5060482
- scientific article; zbMATH DE number 3307153
- scientific article; zbMATH DE number 3388490
- scientific article; zbMATH DE number 961607
- A Subspace Minimization Method for the Trust-Region Step
- A numerical study of limited memory BFGS methods
- A quasi-Newton trust-region method
- A subspace implementation of quasi-Newton trust region methods for unconstrained optimization
- A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization
- Computing Optimal Locally Constrained Steps
- Computing a Trust Region Step
- Interior Methods for Nonlinear Optimization
- Iterative Solution of Augmented Systems Arising in Interior Methods
- Iterative methods for finding a trust-region step
- Methods for Modifying Matrix Factorizations
- New least-square algorithms
- Newton’s Method with a Model Trust Region Modification
- On the Inverse of the Sum of Matrices
- On the limited memory BFGS method for large scale optimization
- Quasi-Newton Methods, Motivation and Theory
- Reduced Storage, Quasi-Newton Trust Region Approaches to Function Optimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- The interior-point revolution in optimization: History, recent developments, and lasting consequences
- Trust Region Methods
- Two new unconstrained optimization algorithms which use function and gradient values
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (7)
- On efficiently combining limited-memory and trust-region techniques
- A limited memory BFGS method for a nonlinear inverse problem in digital breast tomosynthesis
- A class of diagonal preconditioners for limited memory BFGS method
- Algorithm 943: MSS: MATLAB software for L-BFGS trust-region subproblems for large-scale optimization
- Nonlinear conjugate gradient method for spectral tomosynthesis
- Shifted L-BFGS systems
- On solving large-scale limited-memory quasi-Newton equations
This page was built for publication: Limited-memory BFGS systems with diagonal updates (MaRDI item Q417600)