An efficient line search for nonlinear least squares (Q1057187)

From MaRDI portal
Property / author: Mehiddin Al-Baali
Property / author: Roger Fletcher
Property / describes a project that uses: BRENT
Property / MaRDI profile type: MaRDI publication profile
Property / cites work: On Steepest Descent
Property / cites work: Convergence Conditions for Ascent Methods
Property / cites work: Q3882253
Property / cites work: Variational Methods for Non-Linear Least-Squares
Property / cites work: Q5657612
Property / full work available at URL: https://doi.org/10.1007/bf00940566
Property / OpenAlex ID: W2050786966


Language: English
Label: An efficient line search for nonlinear least squares
Description: scientific article

    Statements

    An efficient line search for nonlinear least squares (English)
    Publication year: 1986
    The line search subproblem in unconstrained optimization is concerned with finding an acceptable steplength which satisfies certain standard conditions. Prototype algorithms are described which guarantee to find such a step in a finite number of operations. This is achieved by first bracketing an interval of acceptable values and then reducing this bracket uniformly by the repeated use of sectioning in a systematic way. Some new theorems about convergence and termination of the line search are presented. Use of these algorithms to solve the line search subproblem in methods for nonlinear least squares is considered. We show that substantial gains in efficiency can be made by making polynomial interpolations to the individual residual functions rather than the overall objective function. We also study modified schemes in which the Jacobian matrix is evaluated as infrequently as possible, and show that further worthwhile savings can be made. Numerical results are presented.
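
    The bracketing-and-sectioning scheme summarised in the abstract can be sketched in a few lines of code. The Python fragment below is an illustration only, not the authors' algorithm: it assumes the standard Wolfe-type acceptance tests (a sufficient-decrease parameter rho and a curvature parameter sigma), and it substitutes plain bisection for the polynomial interpolation of the individual residual functions from which the paper obtains its efficiency gains. All names (wolfe_line_search, expand, max_iter) are hypothetical.

def wolfe_line_search(phi, dphi, alpha=1.0, rho=1e-4, sigma=0.9,
                      expand=9.0, max_iter=50):
    """Sketch of a bracketing/sectioning line search (illustrative only).

    phi(a)  = f(x + a*d) along a descent direction d,
    dphi(a) = its derivative with respect to a (so dphi(0) < 0).
    Returns a steplength alpha intended to satisfy
        phi(alpha) <= phi(0) + rho*alpha*dphi(0)   (sufficient decrease)
        |dphi(alpha)| <= -sigma*dphi(0)            (curvature condition)
    """
    phi0, dphi0 = phi(0.0), dphi(0.0)
    lo, hi = 0.0, None                      # bracket; hi unknown until an overshoot occurs

    for _ in range(max_iter):
        if phi(alpha) > phi0 + rho * alpha * dphi0:
            hi = alpha                      # sufficient decrease fails: upper end found
        else:
            slope = dphi(alpha)
            if abs(slope) <= -sigma * dphi0:
                return alpha                # both conditions hold: acceptable step
            if slope >= 0.0:
                hi = alpha                  # slope turned non-negative: upper end found
            else:
                lo = alpha                  # still descending: raise the lower end
        if hi is None:
            alpha *= expand                 # bracketing phase: extrapolate outward
        else:
            alpha = 0.5 * (lo + hi)         # sectioning phase: shrink the bracket
    return alpha                            # fallback after max_iter trials


if __name__ == "__main__":
    # Toy usage: f(x) = x**2 from x = -3 along d = +1, so phi(a) = (a - 3)**2.
    step = wolfe_line_search(lambda a: (a - 3.0) ** 2,
                             lambda a: 2.0 * (a - 3.0))
    print(step)   # 1.0 already satisfies both tests for these parameter values

    In this sketch the bracketing phase extrapolates until an interval containing acceptable points is trapped, after which the sectioning phase shrinks that interval; the paper's refinement is to drive the sectioning by polynomial models of the individual residuals and to reuse Jacobian information where possible.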
    Keywords:
    line search subproblem
    unconstrained optimization
    sectioning
    nonlinear least squares

    Identifiers

    DOI: 10.1007/bf00940566
    OpenAlex ID: W2050786966