Convergence of descent method without line search (Q2570691)

From MaRDI portal
Full work available at URL: https://doi.org/10.1016/j.amc.2004.06.097
OpenAlex ID: W2015080260

Cites work: Stepsize analysis for descent methods
Cites work: Two-Point Step Size Gradient Methods
Cites work: R-linear convergence of the Barzilai and Borwein gradient method
Cites work: Convergence properties of the Beale-Powell restart algorithm
Cites work: Global convergence of a two-parameter family of conjugate gradient methods without line search
Cites work: Conjugate Directions without Linear Searches
Cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization
Cites work: A globally convergent version of the Polak-Ribière conjugate gradient method
Cites work: A class of nonmonotone stabilization methods in unconstrained optimization
Cites work: Quadratically convergent algorithms and one-dimensional search schemes
Cites work: The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
Cites work: On the Barzilai and Borwein choice of steplength for the gradient method
Cites work: Global convergence of conjugate gradient methods without line search
Cites work: Convergence of line search methods for unconstrained optimization
Cites work: Q4928358
Cites work: A class of gradient unconstrained minimization algorithms with adaptive stepsize

Latest revision as of 18:07, 10 June 2024

scientific article

Language: English
Label: Convergence of descent method without line search
Description: scientific article

    Statements

    Convergence of descent method without line search (English)
    Publication date: 28 October 2005
    Descent methods of the form \(x_{k+1}:= x_k+ \alpha_k d_k\) are considered, where \(d_k\) is a descent direction and \(\alpha_k\) the step size. No one-dimensional subproblem needs to be solved; instead, explicit formulas for \(\alpha_k\) are given. Two methods are discussed. The first is applicable if the gradient \(g\) of the objective \(f\) is Lipschitz continuous and the Lipschitz constant \(L\) is easy to estimate. The step size then has the form \(\alpha_k= -g(x_k)^T d_k/(L_k\| d_k\|^2)\), where the \(L_k\) are approximations of \(L\) that enter the convergence proofs. The second method is applicable if the second derivative of \(f\) is bounded by some constant \(M\) that is easy to estimate. In this case, \(\alpha_k= -g(x_k)^T d_k/(M_k\| d_k\|^2)\), where the \(M_k\) are approximations of \(M\) that likewise enter the convergence proofs. Global convergence as well as a linear convergence rate is shown for both methods.
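    The first method above can be sketched in a few lines. The sketch below uses the steepest-descent direction \(d_k=-g(x_k)\) and replaces the paper's adaptive approximations \(L_k\) by a single user-supplied Lipschitz constant `L` (an assumption for illustration; the paper's actual \(L_k\) update is not reproduced here), so that \(\alpha_k=-g_k^Td_k/(L\|d_k\|^2)\) reduces to \(1/L\). All names (`descent_no_line_search`, `f_grad`) are hypothetical.

```python
import numpy as np

def descent_no_line_search(f_grad, x0, L, tol=1e-8, max_iter=10_000):
    """Descent with an explicit step size instead of a line search.

    Step size: alpha_k = -g_k^T d_k / (L * ||d_k||^2), with d_k = -g_k.
    L is a known (or estimated) Lipschitz constant of the gradient.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = f_grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is small
            break
        d = -g                        # steepest-descent direction
        alpha = -(g @ d) / (L * (d @ d))  # explicit formula, no 1-D search
        x = x + alpha * d
    return x

# Example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b;
# the gradient's Lipschitz constant is the largest eigenvalue of A.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 2.0])
x_star = descent_no_line_search(lambda x: A @ x - b, np.zeros(2), L=10.0)
# x_star approximates the minimizer A^{-1} b = [1.0, 0.2]
```

For this quadratic test problem the iteration converges linearly, consistent with the rate established in the paper.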
    Keywords: unconstrained optimization; descent method; line search; global convergence; linear convergence