Truncated Newton method for sparse unconstrained optimization using automatic differentiation (Q1095800)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Truncated Newton method for sparse unconstrained optimization using automatic differentiation | scientific article |
Statements
Truncated Newton method for sparse unconstrained optimization using automatic differentiation (English)
1989
When solving large complex optimization problems, the user faces three major difficulties: (i) the cost in human time of obtaining accurate expressions for the derivatives involved; (ii) the need to store second-derivative information; and (iii), of lesser importance, the time taken to solve the problem on the computer. For many problems, a significant part of the latter can be attributed to solving Newton-like equations. In the algorithm described, these equations are solved by a conjugate direction method that needs the Hessian at the current point only through its product with a trial vector. In this paper, we present a method that computes this product using automatic differentiation while requiring only vector storage. The method exploits any sparsity in the Hessian matrix and computes exact derivatives. It avoids the complexity of symbolic differentiation, the inaccuracy of numerical differentiation, the labor of finding analytic derivatives, and the need for matrix storage. When far from a minimum, an accurate solution of the Newton equations is not justified, so an approximate solution is obtained by a version of \textit{R. Dembo} and \textit{T. Steihaug}'s truncated Newton algorithm [Math. Program. 26, 190-212 (1982; Zbl 0523.90078)].
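The matrix-free core of such a method can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the 2-D Rosenbrock test function, the hand-coded gradient, the forcing constant, and the simple backtracking line search are all assumptions. Forward-mode dual numbers differentiate the gradient along a direction v, yielding the exact product H(x)v with only vector storage; an inner conjugate-gradient loop is then truncated in the spirit of Dembo and Steihaug.

```python
class Dual:
    """Number a + b*eps with eps**2 = 0 (forward-mode AD)."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    @staticmethod
    def lift(o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = Dual.lift(o)
        return Dual(self.a + o.a, self.b + o.b)
    __radd__ = __add__
    def __sub__(self, o):
        o = Dual.lift(o)
        return Dual(self.a - o.a, self.b - o.b)
    def __rsub__(self, o):
        return Dual.lift(o).__sub__(self)
    def __mul__(self, o):
        o = Dual.lift(o)
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.a, -self.b)

def f(x):
    """Rosenbrock test function (an assumption for this sketch)."""
    x1, x2 = x
    return 100 * (x2 - x1 * x1) ** 2 + (1 - x1) ** 2

def grad(x):
    """Hand-coded gradient of f; works on floats or Dual numbers."""
    x1, x2 = x
    return [-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1),
            200 * (x2 - x1 * x1)]

def hvp(x, v):
    """Exact Hessian-vector product H(x) v: the directional derivative
    of grad along v, computed with dual numbers (vector storage only)."""
    xd = [Dual(xi, vi) for xi, vi in zip(x, v)]
    return [g.b for g in grad(xd)]

def truncated_cg(x, g, max_iter=10, eta=0.5):
    """Approximately solve H d = -g by CG, truncated at a loose
    relative residual tolerance (forcing constant eta is an assumption)."""
    n = len(g)
    d = [0.0] * n
    r = [-gi for gi in g]              # residual of H d = -g at d = 0
    p = r[:]
    rr = sum(ri * ri for ri in r)
    tol = eta * rr ** 0.5
    for _ in range(max_iter):
        Hp = hvp(x, p)                 # Hessian only via its product
        pHp = sum(pi * hi for pi, hi in zip(p, Hp))
        if pHp <= 0:                   # negative curvature: stop early
            break
        alpha = rr / pHp
        d = [di + alpha * pi for di, pi in zip(d, p)]
        r = [ri - alpha * hi for ri, hi in zip(r, Hp)]
        rr_new = sum(ri * ri for ri in r)
        if rr_new ** 0.5 <= tol:
            break
        p = [ri + (rr_new / rr) * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return d if any(d) else [-gi for gi in g]   # steepest-descent fallback

def truncated_newton(x, iters=200):
    """Outer loop: truncated Newton with simple backtracking."""
    for _ in range(iters):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < 1e-8:
            break
        d = truncated_cg(x, g)
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) and t > 1e-10:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
    return x
```

Note that no n-by-n matrix is ever formed: `hvp` touches only vectors of length n, which is exactly the storage economy the abstract describes, and the dual-number product is exact rather than a finite-difference approximation.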
Keywords: large complex optimization; conjugate direction method; automatic differentiation; sparsity; Hessian matrix; approximate solution; truncated Newton algorithm