Accelerated gradient descent methods with line search (Q5961879)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s11075-009-9350-8
Property / OpenAlex ID: W2014424370
Property / cites work: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
Property / cites work: Q3539529
Property / cites work: Minimization of functions having Lipschitz continuous first partial derivatives
Property / cites work: Two-Point Step Size Gradient Methods
Property / cites work: Alternate step gradient method
Property / cites work: On the asymptotic behaviour of some new gradient methods
Property / cites work: Modified two-point stepsize gradient methods for unconstrained optimization
Property / cites work: Alternate minimization gradient method
Property / cites work: Analysis of monotone gradient methods
Property / cites work: R-linear convergence of the Barzilai and Borwein gradient method
Property / cites work: Function minimization by conjugate gradients
Property / cites work: Gradient Method with Retards and Generalizations
Property / cites work: On Steepest Descent
Property / cites work: Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
Property / cites work: Q5652137
Property / cites work: The conjugate gradient method in extremal problems
Property / cites work: Efficient line search algorithm for unconstrained optimization
Property / cites work: Q4107408
Property / cites work: On the Barzilai and Borwein choice of steplength for the gradient method
Property / cites work: The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
Property / cites work: Convex Analysis
Property / cites work: Convergence of line search methods for unconstrained optimization
Property / cites work: A class of gradient unconstrained minimization algorithms with adaptive stepsize
Property / cites work: Optimization theory and methods. Nonlinear programming
Property / cites work: Convergence Conditions for Ascent Methods


scientific article; zbMATH DE number 5786243

    Statements

    Accelerated gradient descent methods with line search (English)
    16 September 2010
    The authors consider the unconstrained minimization problem on \(\mathbb{R}^n\) with a twice differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\). A class of gradient descent methods is built by multiplying the iteration step size, computed by a line search procedure, by an appropriate acceleration parameter; methods of this class are called accelerated gradient descent algorithms with line search. In the second part of the paper, a particular accelerated gradient descent method derived from the Newton method with line search is proposed; its acceleration parameter is obtained by replacing the Hessian with an appropriately generated diagonal matrix. Linear convergence of the proposed algorithm is proved for uniformly convex objective functions satisfying some additional conditions. Reported numerical results show that the proposed method outperforms alternative methods known from the literature.
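    The scheme described in the review can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the function names (`backtracking`, `accelerated_gd`), the Armijo rule, and the scalar secant estimate standing in for a diagonal Hessian approximation are all assumptions made for the sketch.

```python
import numpy as np

def backtracking(f, grad, x, d, t=1.0, beta=0.5, sigma=1e-4):
    """Standard Armijo backtracking line search (an assumed rule,
    not necessarily the paper's exact procedure)."""
    g = grad(x)
    while f(x + t * d) > f(x) + sigma * t * g.dot(d):
        t *= beta
    return t

def accelerated_gd(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of an accelerated gradient descent iteration
        x_{k+1} = x_k - t_k * theta_k * grad f(x_k),
    where t_k comes from a line search and theta_k is an acceleration
    parameter taken as the inverse of a scalar secant (diagonal)
    approximation of the Hessian."""
    x = np.asarray(x0, dtype=float)
    theta = 1.0  # initial acceleration parameter
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -theta * g                      # accelerated descent direction
        t = backtracking(f, grad, x, d)     # step size from line search
        x_new = x + t * d
        s, y = x_new - x, grad(x_new) - g
        # Scalar Hessian estimate from the secant condition y ~ H s;
        # its inverse plays the role of the acceleration parameter.
        ss = s.dot(s)
        h = y.dot(s) / ss if ss > 0 else 1.0
        theta = 1.0 / h if h > 0 else 1.0
        x = x_new
    return x

# Usage: a convex quadratic with minimizer (1, 2)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)])
x_star = accelerated_gd(f, grad, np.zeros(2))
```

    For a quadratic objective this secant-based acceleration parameter coincides with a Barzilai-Borwein-type scalar, which is one simple way to realize a diagonal Hessian surrogate; the paper derives its parameter differently, from the Newton method.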
    line search
    gradient descent methods
    Newton method
    convergence rate

    Identifiers