A new descent algorithm with curve search rule (Q1764727)

From MaRDI portal
 
Cited works:

- Enlarging the region of convergence of Newton's method for constrained optimization
- Differential gradient methods
- Stepsize analysis for descent methods
- Study on a supermemory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Conjugate Directions without Linear Searches
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Using function-values in multi-step quasi-Newton methods
- Minimum curvature multistep quasi-Newton methods
- Function minimization by conjugate gradients
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A class of nonmonotone stabilization methods in unconstrained optimization
- Q5684174
- Augmentability in optimization theory
- Quadratically convergent algorithms and one-dimensional search schemes
- Memory gradient method for the minimization of functions
- Numerical Optimization
- Restart procedures for the conjugate gradient method
- Some convergence properties of the conjugate gradient method
- Q4928358
- Q5866904
- Q2720792
- A note on minimization problems and multistep methods
- A new and dynamic method for unconstrained minimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Differential optimization techniques
- Supermemory descent methods for unconstrained minimization
- Note on global convergence of ODE methods for unconstrained optimization

Latest revision as of 17:38, 7 June 2024

scientific article

Language: English
Label: A new descent algorithm with curve search rule
Description: scientific article

    Statements

    A new descent algorithm with curve search rule (English)
    22 February 2005
    A globally convergent curve search algorithm for solving unconstrained minimization problems is developed. The curves that underlie the step-direction and step-size procedure at each iteration are rational expressions in the curve parameter \(\alpha\); numerator and denominator depend linearly on \(\alpha\). The method shows some similarities with conjugate gradient methods, and Wolfe's line search rules are also employed in the step procedures. Numerical experiments compare the proposed method with several standard algorithms.
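The idea of searching along a rational curve rather than a straight line can be illustrated with a minimal sketch. The specific curve form x(α) = x + αd/(1 + cα), the parameter c, and all function names below are assumptions chosen for illustration; the paper's exact curve definition and parameter rules are not reproduced here.

```python
import numpy as np

def curve_point(x, d, alpha, c=1.0):
    # Point on a rational curve through x in direction d: numerator and
    # denominator are linear in alpha (assumed illustrative form).
    return x + (alpha * d) / (1.0 + c * alpha)

def wolfe_ok(f, g, x, d, alpha, c=1.0, c1=1e-4, c2=0.9):
    # Weak Wolfe conditions, evaluated along the curve instead of a line.
    x_new = curve_point(x, d, alpha, c)
    # Tangent of the curve: d/dalpha [alpha/(1+c*alpha)] = 1/(1+c*alpha)^2
    t = d / (1.0 + c * alpha) ** 2
    armijo = f(x_new) <= f(x) + c1 * alpha * (g(x) @ d)
    curvature = g(x_new) @ t >= c2 * (g(x) @ d)
    return armijo and curvature

def curve_search_descent(f, g, x0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        gk = g(x)
        if np.linalg.norm(gk) < tol:
            break
        d = -gk                              # steepest-descent direction
        alpha = 1.0
        while not wolfe_ok(f, g, x, d, alpha) and alpha > 1e-12:
            alpha *= 0.5                     # backtrack along the curve
        x = curve_point(x, d, alpha)
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 * x^T A x.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
f = lambda x: 0.5 * x @ A @ x
g = lambda x: A @ x
xstar = curve_search_descent(f, g, np.array([2.0, -1.5]))
```

The sketch uses a plain steepest-descent direction; the reviewed method's direction rule (with its conjugate-gradient-like features) would replace `d = -gk`.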
    Unconstrained minimization
    Descent method
    Curve search rule
    Global convergence
    Conjugate gradient method
    Numerical experiments

    Identifiers