Accelerated gradient descent methods with line search (Q5961879)

From MaRDI portal
scientific article; zbMATH DE number 5786243

      Statements

      Accelerated gradient descent methods with line search (English)
      16 September 2010
      The authors consider the unconstrained minimization problem on \(\mathbb{R}^n\) with a twice differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\). They study a class of gradient descent methods in which the iteration step size, computed by a line search procedure, is multiplied by an appropriate acceleration parameter; methods of this class are called accelerated gradient descent algorithms with line search. The paper then proposes a particular accelerated gradient descent method derived from the Newton method with line search, whose acceleration parameter is obtained by replacing the Hessian with an appropriately generated diagonal matrix. Linear convergence of the proposed algorithm is proved for uniformly convex objective functions satisfying some additional conditions, and the reported numerical results show that the proposed method outperforms alternative methods known from the literature.
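      The general scheme described above can be sketched as follows. This is only an illustrative implementation under stated assumptions, not the authors' exact algorithm: the Armijo backtracking rule and the secant-style scalar curvature estimate used for the acceleration parameter `gamma` (a crude stand-in for the diagonal Hessian surrogate mentioned in the review) are choices made here for concreteness; the function names `backtracking` and `accelerated_gd` are hypothetical.

```python
import numpy as np

def backtracking(f, g, x, d, t0=1.0, beta=0.5, sigma=1e-4):
    """Armijo backtracking line search along descent direction d."""
    t = t0
    fx, gx = f(x), g(x)
    # shrink t until the sufficient-decrease (Armijo) condition holds
    while f(x + t * d) > fx + sigma * t * (gx @ d):
        t *= beta
    return t

def accelerated_gd(f, g, x0, tol=1e-8, max_iter=1000):
    """Gradient descent whose direction is scaled by a scalar
    acceleration parameter gamma approximating the curvature
    (illustrative stand-in for the paper's diagonal Hessian surrogate)."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0
    for _ in range(max_iter):
        grad = g(x)
        if np.linalg.norm(grad) < tol:
            break
        d = -grad / gamma                 # accelerated descent direction
        t = backtracking(f, g, x, d)      # step size from line search
        x_new = x + t * d
        # secant-style scalar curvature estimate for the next gamma
        s, y = x_new - x, g(x_new) - grad
        if s @ s > 0:
            gamma = max((y @ s) / (s @ s), 1e-8)
        x = x_new
    return x

# usage on a simple convex quadratic f(x) = 0.5 * x^T A x
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
g = lambda x: A @ x
x_star = accelerated_gd(f, g, np.array([5.0, -3.0]))
```

      On a uniformly convex quadratic such as this one, the curvature estimate stays within the spectrum of \(A\) and the iterates converge linearly to the minimizer at the origin, consistent with the convergence result stated in the review.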
      line search
      gradient descent methods
      Newton method
      convergence rate
