On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s11590-015-0936-x
Property / OpenAlex ID: W2272094913
Property / cites work: Accelerated gradient methods for nonconvex nonlinear and stochastic programming
Property / cites work: On Steepest Descent
Property / cites work: An optimal method for stochastic composite optimization
Property / cites work: A gradient-based continuous method for large-scale optimization problems
Property / cites work: Q3809587
Property / cites work: Universal gradient methods for convex optimization problems
Property / cites work: Smooth minimization of non-smooth functions
Property / cites work: The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
Property / cites work: Relaxed steepest descent and Cauchy-Barzilai-Borwein method
Property / cites work: A nonmonotone spectral projected gradient method for large-scale topology optimization problems
Property / cites work: Optimal Primal-Dual Methods for a Class of Saddle Point Problems
Property / cites work: A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Property / cites work: Bregman operator splitting with variable stepsize for total variation image reconstruction
Property / cites work: An alternating direction approximate Newton algorithm for ill-conditioned inverse problems with application to parallel MRI
Property / cites work: A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
Property / cites work: Two-Point Step Size Gradient Methods
Property / cites work: Introductory lectures on convex optimization. A basic course.
Property / cites work: Q5491447
Property / cites work: On the Barzilai and Borwein choice of steplength for the gradient method
Property / cites work: An efficient gradient method using the Yuan steplength
Property / cites work: The Limited Memory Conjugate Gradient Method
Property / cites work: A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
Property / cites work: A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
Property / cites work: Global Convergence Properties of Conjugate Gradient Methods for Optimization
Property / cites work: Some descent three-term conjugate gradient methods and their global convergence
Property / cites work: Convergence Conditions for Ascent Methods
Property / cites work: Convergence Conditions for Ascent Methods. II: Some Corrections
Property / cites work: Q3967358
Property / cites work: First-order methods of smooth convex optimization with inexact oracle
Property / cites work: Optimal methods of smooth convex minimization
Property / cites work: Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization

scientific article
Language: English

    Statements

    On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (English)
    Publication date: 21 September 2016
    Keywords: nonlinear programming, gradient descent method, global convergence, Hölder continuous gradient, convergence rate, upper complexity bound