Truncated Newton method for sparse unconstrained optimization using automatic differentiation (Q1095800)

Cites work:

    Truncated-Newton algorithms for large-scale unconstrained optimization
    Automatic differentiation: techniques and applications
    Inexact Newton Methods
    The Conjugate Gradient Method and Trust Regions in Large Scale Optimization


Language: English
Label: Truncated Newton method for sparse unconstrained optimization using automatic differentiation
Description: scientific article

    Statements

    Truncated Newton method for sparse unconstrained optimization using automatic differentiation (English)
    1989
    When solving large complex optimization problems, the user is faced with three major difficulties. These are: (i) the cost in human time of obtaining accurate expressions for the derivatives involved; (ii) the need to store second-derivative information; and (iii) of lesser importance, the time taken to solve the problem on the computer. For many problems, a significant part of the latter can be attributed to solving Newton-like equations. In the algorithm described, these equations are solved using a conjugate direction method that needs the Hessian at the current point only through its products with trial vectors. In this paper, we present a method that finds these products using automatic differentiation while requiring only vector storage. The method takes advantage of any sparsity in the Hessian matrix and computes exact derivatives. It avoids the complexity of symbolic differentiation, the inaccuracy of numerical differentiation, the labor of finding analytic derivatives, and the need for matrix storage. When far from a minimum, an accurate solution of the Newton equations is not justified, so an approximate solution is obtained using a version of \textit{R. Dembo} and \textit{T. Steihaug}'s truncated Newton algorithm [Math. Program. 26, 190-212 (1983; Zbl 0523.90078)].
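    The core observation here is that the inner conjugate gradient solver never needs the Hessian H explicitly, only products Hv, and automatic differentiation can deliver those products exactly using vector storage alone. A minimal sketch of this idea in JAX follows; it is not the paper's implementation, and the forcing term, the Armijo backtracking line search, and the names hvp and truncated_newton are illustrative assumptions (the sketch also omits the paper's exploitation of Hessian sparsity).

import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # double precision helps CG

def hvp(f, x, v):
    # Hessian-vector product by forward-over-reverse automatic
    # differentiation: the JVP of grad(f) along v equals H(x) @ v.
    # The Hessian itself is never formed or stored.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

def truncated_newton(f, x0, outer_iters=100, grad_tol=1e-8):
    # Dembo-Steihaug-style truncated Newton: solve H p = -g only
    # approximately by CG, stopping the inner iteration once the
    # residual is small relative to ||g|| (forcing term eta).
    x = x0
    for _ in range(outer_iters):
        g = jax.grad(f)(x)
        gnorm = jnp.linalg.norm(g)
        if gnorm < grad_tol:
            break
        eta = jnp.minimum(0.5, jnp.sqrt(gnorm))  # assumed forcing term
        p = jnp.zeros_like(x)
        r = -g                    # CG residual for H p = -g at p = 0
        d = r
        rs = r @ r
        for _ in range(x.size):
            Hd = hvp(f, x, d)     # only Hessian-vector products needed
            dHd = d @ Hd
            if dHd <= 0.0:        # negative curvature: stop inner solve
                if jnp.linalg.norm(p) == 0.0:
                    p = -g        # fall back to steepest descent
                break
            alpha = rs / dHd
            p = p + alpha * d
            r = r - alpha * Hd
            rs_new = r @ r
            if jnp.sqrt(rs_new) <= eta * gnorm:  # truncation test
                break
            d = r + (rs_new / rs) * d
            rs = rs_new
        # Armijo backtracking on the (approximate) Newton direction.
        t, fx, slope = 1.0, f(x), g @ p
        while f(x + t * p) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x

# Example: a sparse-Hessian test problem (extended Rosenbrock).
def rosenbrock(x):
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

print(truncated_newton(rosenbrock, jnp.zeros(100)))

    The truncation test is what makes the method "inexact" in the sense of Dembo, Eisenstat and Steihaug's "Inexact Newton Methods" (cited above): far from the minimum eta is large and the inner CG stops after a few Hessian-vector products, while near the solution the test tightens and the iteration recovers Newton-like convergence.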
    Keywords: large complex optimization; conjugate direction method; automatic differentiation; sparsity; Hessian matrix; approximate solution; truncated Newton algorithm

    Identifiers

    DOI: https://doi.org/10.1007/bf00940007
    OpenAlex ID: W2007007407