Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization (Q670492)
From MaRDI portal
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization | scientific article | |
Statements
Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization (English)
18 March 2019
The authors investigate a block version of the BFGS variable metric update formula for large-scale unconstrained minimization of general functions. The block update satisfies the quasi-Newton conditions with all stored difference vectors and, for quadratic objective functions, yields the best possible improvement of convergence in a certain sense. Properties, modifications, and connections with methods based on vector corrections for conjugacy from previous iterations are presented. Although the new method has many interesting properties, it does not guarantee that the generated direction vectors are descent directions, so it must be modified to be applicable to general (not only quadratic) functions.

To overcome this difficulty, the authors propose a block version of the limited-memory BNS method that exploits the advantageous properties of the block BFGS update. A convenient formula representing the resulting variable metric matrix and a related formula for efficient calculation of the direction vector are derived. Starting from an initial approximation, the inverse Hessian matrix is repeatedly updated (without being formed explicitly) using m pairs of vectors: the differences of consecutive iterates and the differences of consecutive gradients. Global convergence of the algorithm is proved for convex, sufficiently smooth functions. Extensive numerical experiments on two collections of test problems indicate that the block approach can significantly improve on the frequently used L-BFGS and BNS methods.

The paper is quite technical; the complex formulas and heavy notation make it demanding to read. It is intended not for beginners but for readers who already have some experience in the field.
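The review describes the standard limited-memory mechanism that the block BNS method builds on: the inverse Hessian approximation is never formed explicitly, but is applied implicitly to the gradient using the last m pairs (s_k, y_k) of iterate and gradient differences. A minimal sketch of that idea, using the classical L-BFGS two-loop recursion (not the paper's block variant; all names below are illustrative):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list, gamma=1.0):
    """Two-loop recursion: apply the implicit inverse-Hessian
    approximation, built from the last m pairs (s_k, y_k), to the
    gradient g without ever forming a matrix explicitly.

    s_k = x_{k+1} - x_k   (difference of consecutive iterates)
    y_k = g_{k+1} - g_k   (difference of consecutive gradients)
    """
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse the stored pairs from newest to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial inverse-Hessian approximation H_0 = gamma * I
    r = gamma * q
    # Second loop: traverse the pairs from oldest to newest
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r  # search direction
```

With an empty memory (m = 0) this reduces to steepest descent, and with curvature pairs satisfying s_k^T y_k > 0 it produces a descent direction; the block BFGS update studied in the paper generalizes the per-pair updates to a single update with a block of difference vectors.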
unconstrained optimization
large-scale problems
block variable metric methods
limited-memory methods
BFGS update
BNS method
quasi-Newton conditions
algorithms
global convergence
numerical experiments